Calibration of Inertial Measurement Unit

Abstract: The invention provides a method and system for calibrating an inertial measurement unit (IMU) of an electronic device. The method includes receiving a reference point of an object in an environment, determining geographic coordinates of the reference point and determining relative location of the electronic device relative to the reference point. Further, the method includes calculating geographic coordinates of the electronic device using the geographic coordinates of the reference point and the relative location of the electronic device, and calibrating the IMU using the geographic coordinates of the electronic device. FIG. 6


Patent Information

Application #: 835/DEL/2013
Filing Date: 20 March 2013
Publication Number: 16/2016
Publication Type: INA
Invention Field: PHYSICS
Status: Granted
Email: patent@brainleague.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2021-06-07
Renewal Date:

Applicants

Samsung India Electronics Pvt. Ltd.
Logix Cyber Park, Plot No. C-28 & 29, Tower D, Noida Sec-62

Inventors

1. Dr. Sumit Mediratta
Near Police Chowky, Physical College Road, Shivpuri, Madhya Pradesh, Pin: 473551
2. Saurabh Tyagi
H/No 6/159, Sector 2, Rajendra Nagar, Ghaziabad, Uttar Pradesh (India), Pin: 201005

Specification

FIELD OF INVENTION
[001] The present invention relates to calibration systems, and more
particularly to a mechanism for calibrating an inertial measurement unit of
a mobile phone using stereoscopic and other ranging techniques.
BACKGROUND OF INVENTION
[002] The accurate ‘estimation’ and/or ‘calibration’ of the location of
an object involves significant challenges. Many different methods and
systems have been proposed to estimate the geographical coordinates (or
location) of objects such as mobile phones. The conventional systems and methods
include, for example, but are not limited to, the global positioning system (GPS),
Wi-Fi positioning system (WPS), inertial measurement unit (IMU), cell
tower triangulation (CTT), two cell towers with directional antennas, hybrid
positioning systems, location-based services (LBS), and the like.
[003] The accuracy of estimating the location of an object using
GPS can be poor in urban areas (or indoor areas), as the GPS view of
satellites can be mostly blocked. Unlike GPS, WPS relies on the presence of
access points (APs), which may have a very small communication range. For
accurate estimation of the location of objects in open areas, many APs
need to be configured and maintained in proximity to each other, which may
increase the overall system cost. CTT may need connections with a
plurality of towers to accurately estimate the location of objects. IMUs
with the conventional calibration/recalibration methods can suffer from
drift errors and are commonly used together with additional technologies (such
as GPS). The use of such additional techniques can increase the overall
system cost. Moreover, the accuracy of a calibrated IMU can be limited by
the accuracy of the primary technology used for calibration. Further, the
two-cell-tower approach with directional antennae and other methods can involve
substantially similar challenges.
[004] Though the conventional systems and methods are effective to
a degree in estimating the geographical coordinates of objects, they involve
trade-offs in terms of accuracy, time, cost,
performance, reliability, effectiveness, and the like.
OBJECT OF INVENTION
[005] The principal object of the embodiments herein is to provide a
method and system for calibrating an inertial measurement unit (IMU).
[006] Another object of the invention is to provide a method and
system for accurately estimating geographic coordinates of an object.
[007] Another object of the invention is to provide a mechanism
using stereoscopic techniques to determine geographic coordinates of an
object.
SUMMARY
[001] Accordingly, the invention provides a method for calibrating
an inertial measurement unit (IMU) of an electronic device. The method
includes receiving a reference point of an object in an environment,
determining geographic coordinates of the reference point and determining
relative location of the electronic device relative to the reference point.
Further, the method includes calculating geographic coordinates of the
electronic device using the geographic coordinates of the reference point
and the relative location of the electronic device, and calibrating the IMU
using the geographic coordinates of the electronic device.
[002] Furthermore, the method includes implementing stereographic
technique, light detection and ranging (LIDAR) technique, sound
navigation and ranging (SONAR) technique, and the like in the electronic
device to determine the relative location of the electronic device.
Furthermore, the method includes receiving an image of the environment
using the electronic device. Receiving the image of the environment includes
recording a stereoscopic image of the environment including the reference
point using the electronic device. Furthermore, the method includes
identifying the reference point of the object in the image of the
environment and determining the geographic coordinates of the reference
point of the object using the electronic device, image processing
techniques, wireless communication, third-party sources, and the like.
[003] Accordingly, the invention provides a system for calibrating
an inertial measurement unit (IMU). The system includes an electronic
device configured to receive a reference point of an object in an
environment, determine geographic coordinates of the reference point, and
determine the relative location of the electronic device relative to the reference
point. Further, the electronic device can be configured to calculate
geographic coordinates of the electronic device using the geographic
coordinates of the reference point and the relative location of the electronic
device, and calibrate the IMU using the geographic coordinates of the
electronic device.
[004] Furthermore, the electronic device is configured to implement
stereographic technique, light detection and ranging (LIDAR) technique,
sound navigation and ranging (SONAR) technique, and the like to
determine the relative location of the electronic device. Furthermore, the
electronic device is configured to implement a stereo camera to capture a
stereoscopic image of an environment and identify the reference point of
the object in the image of the environment. Furthermore, the electronic
device is configured to determine the geographic coordinates of the
reference point of the object using the electronic device, image processing
techniques, wireless communication, third-party sources, and the like.
[005] These and other aspects of the embodiments herein will be
better appreciated and understood when considered in conjunction with the
following description and the accompanying drawings. It should be
understood, however, that the following descriptions, while indicating
preferred embodiments and numerous specific details thereof, are given by
way of illustration and not of limitation. Many changes and modifications
may be made within the scope of the embodiments herein without departing
from the spirit thereof, and the embodiments herein include all such
modifications.
BRIEF DESCRIPTION OF FIGURES
[006] This invention is illustrated in the accompanying drawings,
throughout which like reference letters indicate corresponding parts in the
various figures. The embodiments herein will be better understood from the
following description with reference to the drawings, in which:
[007] FIG. 1 is a schematic representation of an environment in
which various embodiments of the present invention operate, according to
embodiments as disclosed herein;
[008] FIG. 2 illustrates generally, among other things, the electronic
device as described in FIG. 1, according to embodiments as disclosed
herein;
[009] FIG. 3 is a graph that generally illustrates the projection of a
reference point in the left and right images, as described in FIG. 2,
according to embodiments as disclosed herein;
[0010] FIG. 4 is a diagram that generally illustrates an exemplary
reference location whose geographic coordinates may not be directly
communicated to the electronic device but can be obtained using other
methods, according to embodiments as disclosed herein;
[0011] FIG. 5 is a diagram that generally illustrates an exemplary
reference location whose geographic coordinates can be communicated to
the electronic device, according to embodiments as disclosed herein;
[0012] FIG. 6 is a flow diagram that generally illustrates a method for
calibrating the IMU of the electronic device, according to embodiments as
disclosed herein; and
[0013] FIG. 7 depicts a computing environment implementing the
application, in accordance with various embodiments of the present
invention.
DETAILED DESCRIPTION OF INVENTION
[0014] The embodiments herein and the various features and
advantageous details thereof are explained more fully with reference to the
non-limiting embodiments that are illustrated in the accompanying
drawings and detailed in the following description. Descriptions of well-known
components and processing techniques are omitted so as to not
unnecessarily obscure the embodiments herein. The examples used herein
are intended merely to facilitate an understanding of ways in which the
embodiments herein can be practiced and to further enable those skilled in
the art to practice the embodiments herein. Accordingly, the examples
should not be construed as limiting the scope of the embodiments herein.
[0015] The embodiments herein achieve a method and system for
calibrating an inertial measurement unit (IMU) of an electronic device. The
electronic device includes a stereo camera to record a stereo image of an
environment. The electronic device can be configured to identify a
reference point of an object in the environment and determine the
associated geographic coordinates of the reference point. The relative
location of the electronic device can be determined using the stereo vision
or other ranging techniques used by the electronic device. Further, the
system and method include calculating the geographic coordinates of the
electronic device using the determined geographic coordinates of the
reference point and the determined relative location of the electronic
device. The electronic device can be configured to use the calculated
geographic coordinates to calibrate/recalibrate the IMU.
[0016] Generally, the IMU is an electronic device or module
configured to measure and report velocity, orientation, location, direction,
gravitational forces, and similar properties of an object. It uses a
combination of components such as, for example, but not limited to,
accelerometers, gyroscopes, magnetometers, and the like to measure the
properties of the object. Almost all the components of the IMU are
commonly implemented or included in electronic devices (such as
smart phones). The proposed invention uses already existing components
and methods (such as the IMU and stereoscopic techniques), which can
rapidly become common modules in electronic devices for providing an
accurate estimation of the location information. Further, the IMU can be
included in or coupled to different objects such as, for example, but not limited
to, cars, aeroplanes, trucks, any electronic device, and the like.
[0017] The method and system disclosed herein are simple, dynamic,
robust, and reliable for accurately estimating the geographic location of an
object and performing associated calibrations and recalibrations. Unlike
conventional systems, the system and method allow the IMU to turn its
power on/off based on requirement and usage. Such power on/off
capabilities allow users to efficiently calibrate and recalibrate the IMU.
Further, the system need not integrate or communicate with other
technologies such as, for example, but not limited to, the global positioning
system (GPS), Wi-Fi positioning system (WPS), cell tower triangulation
(CTT), and the like to estimate the location of objects, which
significantly reduces the overall system cost. The system uses
stereoscopic or other ranging techniques (such as light detection and
ranging (LIDAR)) to increase the accuracy of estimating the geographic
coordinates and efficiently calibrate and/or recalibrate the IMU.
Furthermore, the proposed system and method can be implemented using
existing infrastructure, components, and modules, and may not require
extensive set-up or instrumentation.
[0018] Referring now to the drawings, and more particularly to
FIGS. 1 through 7, where similar reference characters denote corresponding
features consistently throughout the figures, there are shown preferred
embodiments.
[0019] FIG. 1 is a schematic representation of an environment 100,
in which various embodiments of the present invention operate, according
to embodiments as disclosed herein. The environment 100 includes an
electronic device 102 and a reference location 104. In an embodiment, the
electronic device 102 described herein can include, for example, but not
limited to, a portable electronic device, desktop computer, laptop, personal
digital assistant (PDA), smart phone, tablet, communicator, processor,
microcontroller, or any other electronic device. The electronic device 102
described herein can be configured to include or implement an inertial
measurement unit (IMU). Further, in an embodiment, the IMU can be
implemented in, included in, and/or coupled to objects such as, for example,
but not limited to, cars, airplanes, ships, and the like. The components used to
implement the IMU can include, for example, but not limited to,
accelerometers, gyroscopes, magnetometers, or a combination thereof.
[0020] In an embodiment, the reference location 104 described
herein can include, for example, but not limited to, a building, a shopping
mall, a shop, a commercial place, a landmark, or any other open or closed
location. In an embodiment, the electronic device 102 can be configured to
identify a reference point 106 (also referred to as a point of interest or interested
object position) in the reference location 104. In an embodiment, a user of
the electronic device can take an image of the reference location 104 and
select a region of the image to indicate the reference point 106 in the
electronic device 102. Alternatively, the reference location 104 can be
scanned using the electronic device 102 and the reference point 106 can be
automatically identified by the electronic device 102. Further, the various
operations performed by the electronic device 102 to identify the reference
point 106 and further process the information are described in conjunction
with the FIGS. 2 through 6.
[0021] In an embodiment, the geographic coordinates of the
reference point 106 can be communicated to the electronic device 102. In
an embodiment, the reference point 106 can be a well-known landmark
whose geographic coordinates can be easily communicated to the electronic
device 102. An exemplary landmark and the associated operations performed
by the electronic device 102 are described in conjunction with FIG. 5.
[0022] In an embodiment, the reference point 106 may not be a
well-known location, or its geographic coordinates may not be
communicated to the electronic device 102. In such a scenario, the electronic
device 102 can be configured to interact with various other sources over a
communication network to determine the geographic coordinates of the
reference point 106. The communication network described herein can
represent a number of distinct computer networks or communication
media, for example, but not limited to, a wireless network, wire-line
network, public network such as the Internet, local area network, wide area
network, personal area network, private network, cellular network, global
system for mobile communication (GSM) network, a combination thereof, or
any other network. An exemplary landmark and the associated operations
performed by the electronic device 102 are described in conjunction with
FIG. 4.
[0023] In FIG. 1, the electronic device 102 shown is a smart
phone and the reference location 104 shown is a location whose
coordinates can be communicated to the electronic device 102. Further, it is
understood that the exemplary environment is not limited thereto.
[0024] FIG. 2 illustrates generally, among other things, the
electronic device 102 as described in FIG. 1, according to embodiments
as disclosed herein. FIG. 2 describes an exemplary electronic device
102 including a camera flash 202 and a stereo camera 204. The electronic
device 102 can be configured to implement stereoscopic techniques (also
referred to as stereoscopic or 3D imaging), light detection and ranging
(LIDAR), sound navigation and ranging (SONAR), and the like to
accurately estimate the geographic coordinates of the electronic device 102.
The stereoscopic techniques described herein can be any stereo technique
known or currently unknown in the art. The stereo camera 204 can be
configured to include two or more lenses with a separate image sensor or
film frame for each lens. Generally, the distance between the lenses in the
stereo camera 204 (the intra-axial distance) is about the distance between
one's eyes (known as the intra-ocular distance).
[0025] The electronic device 102 can be configured to use the stereo
camera 204 to capture three-dimensional images of the reference location
104. The electronic device 102 can be configured to automatically identify
the reference point 106 using image processing techniques known or
currently unknown in the art. Alternatively, the electronic device 102 can
use a touch input, gesture input, or any other input provided by the user to
identify the reference point 106. Further, the electronic device 102 can be
configured to use a stereo vision technique, LIDAR, SONAR, and the like
to determine the relative location of the electronic device 102 using the
three-dimensional image of the reference location 104. The relative
location of the electronic device 102 with respect to the chosen reference
point 106 can be calculated in terms of (Dx, Dy, Dz), where Dx, Dy, and Dz
represent the relative displacements along the x, y, and z directions
respectively, represented in centimeter-gram-second (CGS) or
meter-kilogram-second (MKS) units.
[0026] In an embodiment, the electronic device 102 can be
configured to determine the relative direction of the electronic device 102
using the stereo-vision technique. In an embodiment, the stereo-vision
technique described herein can extract 3D information from digital images.
The stereo vision uses the stereo camera to capture two images (left and
right) of the scene simultaneously. A stereo vision algorithm can be used to
compare information about the scene from the two images. Corresponding
pixels can be identified in the left and right images that represent the same
reference point 106 in both the images. In an embodiment, the equation
used to calculate the depth (distance) of the reference point from the stereo
camera 204 is

Z = f * T / (xl - xr)

[0027] where Z represents the depth (or distance), f represents the focal
length of both the left and right lenses, T represents the distance between
the centers of projection of the left (Ol) and right (Or) lenses, and xl and xr
represent the horizontal positions of the projection points in the left and
right images respectively. The graphical representation of the equation is
described in conjunction with FIG. 3.
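
As a concrete check of this relation, here is a minimal sketch in Python; the focal length and baseline values are assumptions chosen for the example, not values taken from the patent.

def depth_from_disparity(f_px, T_m, xl_px, xr_px):
    # Depth Z = f * T / (xl - xr) for a rectified stereo pair.
    # f_px: focal length in pixels (assumed equal for both lenses);
    # T_m: baseline T between the centers of projection, in meters;
    # xl_px, xr_px: horizontal positions of the projection points.
    d = xl_px - xr_px  # disparity
    if d <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    return f_px * T_m / d

# Worked example: f = 700 px, T = 6 cm, disparity = 400 - 365 = 35 px,
# so Z = 700 * 0.06 / 35 = 1.2 m.
print(depth_from_disparity(700.0, 0.06, 400.0, 365.0))  # 1.2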
[0028] In an embodiment, the depth for each pixel visible in both the left
and right images is calculated using the above equation. Further, a
stereoscopic mechanism known in the art can be used to generate a depth
(distance) map of the scene in front of the camera based on the calculated
pixel depths. Furthermore, from the depth data for the identified reference point
106 of the reference location 104 and the relative direction of the electronic
device 102, the relative location (Dx, Dy, and Dz) of the electronic device
102 can be calculated. The relative location of the electronic device 102 can
be calculated in terms of the CGS or MKS units with respect to the
identified reference point 106 of the reference location 104.
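
One way such a depth map can be generated is sketched below with OpenCV's block-matching stereo correspondence; the patent does not prescribe a specific algorithm, and the file names, focal length, baseline, and reference-point pixel are assumptions.

import cv2
import numpy as np

# Load a rectified left/right pair (hypothetical file names).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matching; OpenCV returns fixed-point disparity scaled by 16.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

f_px, T_m = 700.0, 0.06              # assumed focal length (px) and baseline (m)
valid = disparity > 0                # pixels with a usable match
depth = np.zeros_like(disparity)
depth[valid] = f_px * T_m / disparity[valid]   # Z = f * T / (xl - xr) per pixel

# Back-project the chosen reference point (u, v) to (Dx, Dy, Dz) in meters,
# relative to the left camera center (principal point assumed at image center).
u, v = 320, 240                      # hypothetical pixel of reference point 106
cx, cy = left.shape[1] / 2, left.shape[0] / 2
Z = depth[v, u]
Dx, Dy, Dz = (u - cx) * Z / f_px, (v - cy) * Z / f_px, Z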
[0029] FIG. 3 is a graph 300 that generally illustrates the projection of the
reference point in the left and right images as described in FIG. 2,
according to embodiments as disclosed herein. The graphical representation
of the equation to calculate the depth (distance) of the reference point 106
from the stereo camera 204 is shown in FIG. 3. Ol represents the
center of projection of the left stereo camera and Or represents the center of
projection of the right stereo camera. xl and xr represent the horizontal
positions of the projection points in the left and right images; xl lies in the
image plane of the left camera and xr in the image plane of the right
camera. P represents the reference point 106. The electronic device 102
can be configured to determine the corresponding image coordinates of the
reference point (P) in both the left and right images, so that xl and xr
represent the projection of the same reference point in the left and right
images. T represents the relative distance between the centers of projection
of the left and right cameras. The disparity (i.e., the difference in position
between the corresponding points in the two images) can be calculated
using the equation d = xl - xr.
[0030] FIG. 4 is a diagram 400 that generally illustrates an exemplary
reference location 402 whose geographic coordinates may not be
communicated to the electronic device 102, according to embodiments
as disclosed herein. The reference location 402 described herein may be a
well-known location, but its geographic coordinates may not be
communicable to the electronic device 102. In such a scenario,
the electronic device 102 can be configured to interact with various other
sources over the communication network to determine the geographic
coordinates. FIG. 4 shows a locally well-known reference location 402
(including an XYZ factory). The various operations performed by the
electronic device 102 in communication with other components of the
system are described below.
[0031] In an embodiment, the user of the electronic device 102 can
take an image of the reference location 402 and select a region of the
image to indicate the reference point 404 in the electronic device 102.
Alternatively, the reference location 402 can be scanned using the
electronic device 102 and the reference point 404 can be automatically
identified by the electronic device 102. The image described herein can be a
stereo image, 3D image, dynamic image, and the like.
[0032] In an embodiment, the electronic device 102 can be
configured to determine the geographic coordinates of the reference point
404. As the reference location 402 cannot communicate its geographic
coordinates to the electronic device 102, the electronic device 102 can be
configured to communicate with various third-party sources 406 such as, for
example, but not limited to, service providers, local portals, geo-databases,
coordinate systems, and the like. For example, the electronic device 102
can be configured to send the stereo image of the reference location 402,
including the indication of the reference point 404, to the third-party source
406 over the communication network. The third-party source 406 can be
configured to determine the geographic coordinates using the information
present in its database. In an embodiment, the third-party source 406 can
be configured to interact with various other sources or internal and external
databases to determine the geographic coordinates of the reference point
404 in the stereo image of the reference location 402.
[0033] The geographic coordinate information of the reference
point can be used to significantly reduce the search space for server-side
programs and to increase the performance and accuracy of the system (by
eliminating many similar potential matches belonging to other localities in
the real world). Further, upon determining the geographic coordinates of the
reference point 404, the third-party source 406 can be configured to send
the determined geographic coordinates of the reference point 404 to the
electronic device 102 over the communication network.
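
As an illustration of this exchange, the sketch below posts the stereo image and the marked point to a geo-lookup service; the endpoint, parameters, and response schema are all hypothetical, since the patent names only generic third-party sources.

import requests

# Hypothetical geo-coding endpoint; not a real service or API.
GEOCODE_URL = "https://example-geoportal.test/api/locate"

def lookup_reference_point(stereo_image_path, point_xy):
    # Send the stereo image plus the marked reference point; the service is
    # assumed to answer {"lat": ..., "lon": ..., "elev": ...} on a match.
    with open(stereo_image_path, "rb") as f:
        resp = requests.post(
            GEOCODE_URL,
            files={"image": f},
            data={"x": point_xy[0], "y": point_xy[1]},
            timeout=10,
        )
    resp.raise_for_status()
    return resp.json()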
[0034] In an embodiment, upon receiving the geographic
coordinates of the reference point 404, the relative location of the electronic
device with respect to the reference point 404 can be determined. In an
example, the electronic device 102 can be configured to use a stereo vision
technique to determine the relative location of the electronic device 102
using the stereo images of the reference location 402. The electronic device
102 can be configured to calculate the relative location of the electronic
device 102 with respect to the chosen reference point 404 in terms of (Dx,
Dy, Dz), where Dx, Dy, and Dz represent the relative displacements along
the x, y, and z directions respectively.
[0035] In an embodiment, the stereo-vision technique described
herein can extract the 3D information from digital images. The stereo
vision uses the stereo camera 204 of the electronic device 102 to capture
two images (left and right) of the reference location 402 simultaneously. A
stereo vision algorithm can be used to compare information about the reference
location 402 from the two images. Corresponding pixels can be identified
in the left and right images that represent the same reference point 404
in both the respective images. In an embodiment, the equation used to
calculate the depth (distance) of the reference point 404 from the stereo
camera 204 is Z = f * T / (xl - xr), as described above.
[0036] Further, a depth (distance) map of the reference location 402
can be generated based on the calculated pixel depths. The electronic device 102
can be configured to use the depth data for the reference point 404 to
determine the relative location (Dx, Dy, and Dz) of the electronic device
102. The relative location of the electronic device 102 can be determined in
terms of the CGS or MKS units with respect to the identified reference
point 404 of the reference location 402. The detailed operations performed
to determine the relative location of the electronic device 102 are described
in conjunction with the FIGS. 2 and 3.
[0037] In an embodiment, the electronic device 102 can be
configured to calculate the geographic coordinates of the electronic device
102 using the geographic coordinates of the reference point 404 and the
relative location of the electronic device 102. The following equation can
be used to calculate the geographic coordinates of the electronic device 102:

(Latitude, Longitude, Elevation) of Electronic Device = (Latitude, Longitude, Elevation) of Reference Point + (DLatitude, DLongitude, DElevation) of Electronic Device relative location.
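
A minimal sketch of this addition in Python, assuming the relative displacements (Dx, Dy, Dz) are expressed in meters in a local east-north-up frame aligned with the geographic axes and are small enough for a flat-earth step; the patent does not fix the frame or the unit conversion.

import math

M_PER_DEG_LAT = 111_320.0  # approximate meters per degree of latitude

def device_coordinates(ref_lat, ref_lon, ref_elev, Dx, Dy, Dz):
    # Apply the relative displacement (east Dx, north Dy, up Dz, in meters)
    # to the reference point's geographic coordinates.
    dlat = Dy / M_PER_DEG_LAT
    dlon = Dx / (M_PER_DEG_LAT * math.cos(math.radians(ref_lat)))
    return ref_lat + dlat, ref_lon + dlon, ref_elev + Dz

# e.g. 12 m east and 5 m north of a reference point at (28.6139 N, 77.2090 E)
print(device_coordinates(28.6139, 77.2090, 216.0, 12.0, 5.0, 0.0))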
[0038] Further, in an embodiment, the electronic device 102 can be
configured to use the geographic coordinates of the electronic device 102 to
calibrate or recalibrate the IMU. In an embodiment, the electronic device
102 can be configured to use various methods to implement the IMU for
estimating the electronic device location information. The various methods
described herein can include, for example, but not limited to, the extended
Kalman filter (EKF), the unscented Kalman filter (UKF), particle-filter-based
implementations, and the like. Depending on the method used by the
electronic device 102, the electronic device 102 can be configured to
calibrate or recalibrate the IMU using the geographic coordinates of the
electronic device 102.
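
To make the correction step concrete, here is a minimal scalar Kalman-style update for one position axis; it is a sketch only, since the patent lists EKF/UKF/particle filters without fixing an implementation, and all the noise values below are assumptions.

def kalman_position_update(x_est, P, z_fix, R, Q=0.01):
    # x_est: IMU-propagated position estimate (one axis, meters);
    # P: variance of that estimate; z_fix: position fix from the
    # stereo/reference-point method; R: variance of that fix;
    # Q: process noise added during propagation.
    P = P + Q                            # predict: uncertainty grows with drift
    K = P / (P + R)                      # gain: trust fix vs. IMU proportionally
    x_est = x_est + K * (z_fix - x_est)  # pull the estimate toward the fix
    P = (1.0 - K) * P                    # corrected uncertainty
    return x_est, P

x, P = 105.0, 4.0  # drifted IMU estimate with variance 4 m^2
x, P = kalman_position_update(x, P, z_fix=100.0, R=1.0)
print(x, P)        # the estimate moves most of the way toward the fix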
[0039] FIG. 5 is a diagram that generally illustrates an exemplary
reference location 502 whose geographic coordinates can be communicated
to the electronic device 102, according to embodiments as disclosed herein.
The reference location 502 described herein may be a well-known location
whose geographic coordinates can be communicated to the electronic
device 102. FIG. 5 shows the reference location 502 (including a
signboard on a railway bridge) that displays geographic coordinate
information, and the electronic device 102 carried by users travelling in
cars. The various operations performed by the electronic device 102 in
communication with other components of the system are described below.
[0040] In an embodiment, the user of the electronic device 102 can
take an image of the reference location 502 and select a region of the
image to indicate the reference point 504 in the electronic device 102.
Alternatively, the reference location 502 can be scanned using the
electronic device 102 and the reference point 504 can be automatically
identified by the electronic device 102. The image described herein can be a
stereo image, 3D image, dynamic image, and the like.
[0041] In an embodiment, the electronic device 102 can be
configured to determine the geographic coordinates of the reference point
504. The reference location 502 can communicate its geographic
coordinates to the electronic device 102. In an embodiment, the electronic
device 102 can be configured to use image processing techniques for
extracting the geographic coordinate information from the image of the
reference location 502. In an embodiment, the image processing techniques
described herein can include, for example, but not limited to, text
recognition techniques, optical character recognition, grey scale mapping,
texture recognition, or any other image processing technique known in the
art. In an embodiment, wireless broadcast methods can be used to broadcast
the exact geographic coordinate information of the reference point 504 to
the electronic device 102. For example, the geographic coordinate
information of the reference point 504 can be broadcast on a Wi-Fi
network.
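
For the signboard case, the printed coordinates can be pulled out with off-the-shelf OCR. The sketch below assumes the pytesseract wrapper around Tesseract and a hypothetical "LAT ... LON ..." sign layout; the patent names text recognition and OCR only generically.

import re
import pytesseract
from PIL import Image

def coordinates_from_sign(image_path):
    # OCR the signboard image, then parse a hypothetical coordinate layout
    # such as "LAT 28.6139 N LON 77.2090 E".
    text = pytesseract.image_to_string(Image.open(image_path))
    m = re.search(r"LAT\s+([\d.]+).*?LON\s+([\d.]+)", text, re.S)
    if not m:
        return None  # no recognizable coordinates on the sign
    return float(m.group(1)), float(m.group(2))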
[0042] In an embodiment, upon determining the geographic
coordinates of the reference point 504, the relative location of the electronic
device with respect to the reference point 504 can be determined. In an
example, the electronic device 102 can be configured to use a stereo vision
technique to determine the relative location of the electronic device 102
using the stereo images of the reference location 502. The electronic device
102 can be configured to calculate the relative location with respect to the
chosen reference point 504 in terms of (Dx, Dy, Dz), where Dx, Dy, and Dz
represent the relative displacements along the x, y, and z directions
respectively.
[0043] In an embodiment, the stereo-vision technique described
herein can extract the 3D information from digital images. The stereo
vision uses the stereo camera 204 of the electronic device 102 to capture
two images (left and right) of the reference location 502 simultaneously. A
stereo vision algorithm can be used to compare information about the reference
location 502 from the two images. Corresponding pixels can be identified
in the left and right images that represent the same reference point 504
in both the respective images. In an embodiment, the equation used to
calculate the depth (distance) of the reference point 504 from the stereo
camera 204 is Z = f * T / (xl - xr), as described above.
[0044] Further, a depth (distance) map of the reference location 502
can be generated based on the calculated pixel depths. The electronic device 102
can be configured to use the depth data for the reference point 504 to
determine the relative location (Dx, Dy, and Dz) of the electronic device
102. The relative location of the electronic device 102 can be determined in
terms of the CGS or MKS units with respect to the identified reference
point 504 of the reference location 502. The detailed operations performed
to determine the relative location of the electronic device 102 are described
in conjunction with the FIGS. 2 and 3.
[0045] In an embodiment, the electronic device 102 can be
configured to calculate the geographic coordinates of the electronic device
102 using the geographic coordinates of the reference point 504 and the
relative location of the electronic device 102. The following equation can
be used to calculate the geographic coordinates of the electronic device 102:

(Latitude, Longitude, Elevation) of Electronic Device = (Latitude, Longitude, Elevation) of Reference Point + (DLatitude, DLongitude, DElevation) of Electronic Device relative location.
[0046] Further, in an embodiment, the electronic device 102 can be
configured to use the geographic coordinates of the electronic device 102 to
calibrate or recalibrate the IMU.
[0047] FIG. 6 is a flow diagram illustrating a method 600 for
calibrating the IMU of the electronic device 102, according to embodiments
as disclosed herein. In an embodiment, at 602, the method 600 includes
capturing an image of a reference location. In an example, the method 600
allows the user of the electronic device 102 to take an image of the
reference location. In an example, the method 600 includes scanning the
reference location using the electronic device 102 to record the stereo
image of the reference location. The image described herein can be a stereo
image, 3D image, dynamic image, and the like.
[0048] In an embodiment, at 604, the method 600 includes
identifying the reference point in the stereo image of the reference location.
In an embodiment, the method 600 allows the user to select a region of the
image to indicate the reference point. For example, the user can provide a
touch input, gesture input, gaze input, or any other type of input to identify
the reference point in the stereo image of the reference location.
Alternatively, the method 600 allows the electronic device 102 to
automatically identify the reference point in the stereo image of the
reference location.
[0049] In an embodiment, at 606, the method 600 includes
determining the geographic coordinates of the reference point. In an example,
for a reference location which cannot communicate its geographic
coordinates to the electronic device 102, the method 600 allows the
electronic device 102 to communicate with various third-party sources to
receive the geographic coordinates of the reference point. The various
third-party sources can include, for example, but not limited to, service
providers, local portals, geo-databases, coordinate systems, and the like.
The method 600 allows the electronic device 102 to send the stereo image
of the reference location (including the indication of the reference point) to
the third-party source over the communication network. The third-party
source can determine the geographic coordinates using the information
present in various internal and external databases. Further, upon
determining the geographic coordinates of the reference point, the third-party
source can send the determined geographic coordinates of the
reference point to the electronic device 102 over the communication
network.
[0050] In another example, for a reference location which can
communicate its geographic coordinates to the electronic device 102, the
method 600 allows the electronic device 102 to use image processing
techniques for extracting the geographic coordinate information from the
image of the reference location. In yet another example, wireless broadcast
methods can be used to broadcast the exact geographic coordinate
information of the reference point to the electronic device 102.
[0051] In an embodiment, at 608, the method 600 includes
determining the relative location of the electronic device 102 with respect
to the reference point. In an example, the method 600 allows the electronic
device 102 to use the stereo vision technique or other ranging techniques
(such as LIDAR, SONAR, and the like) to determine the relative location
of the electronic device 102 using the stereo images of the reference
location. The method 600 allows the electronic device 102 to calculate the
relative location with respect to the chosen reference point in terms of (Dx,
Dy, Dz), where Dx, Dy, and Dz represent the relative displacements along
the x, y, and z directions respectively.
[0052] In an embodiment, the stereo-vision technique described
herein can extract the 3D information from digital images. The stereo
vision uses the stereo camera 204 of the electronic device 102 to capture
two images (left and right) of the reference location simultaneously. A
stereo vision algorithm can be used to compare information about the reference
location from the two images. Corresponding pixels can be identified in the
left and right images that represent the same reference point in both the
respective images. In an embodiment, the equation used to calculate the depth
(distance) of the reference point from the stereo camera 204 is
Z = f * T / (xl - xr), as described above.
[0053] Further, the method 600 includes generating a depth
(distance) map of the reference location based on the calculated pixel depths. The
method 600 allows the electronic device 102 to use the depth data for the
reference location to determine the relative location (Dx, Dy, and Dz). The
relative location of the electronic device 102 can be determined in terms of
the CGS or MKS units with respect to the identified reference point of the
reference location.
[0054] In an embodiment, at 610, the method 600 includes
calculating the geographic coordinates of the electronic device 102 using
the geographic coordinates of the reference point and the relative location
of the electronic device 102. The following equation can be used to
calculate the geographic coordinates of the electronic device 102:

(Latitude, Longitude, Elevation) of Electronic Device = (Latitude, Longitude, Elevation) of Reference Point + (DLatitude, DLongitude, DElevation) of Electronic Device relative location.

In an embodiment, at 612, the method 600 includes calibrating or recalibrating
the IMU using the geographic coordinates of the electronic device 102.
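
Putting steps 602 through 612 together, the following sketch strings the operations of method 600 into one routine; every helper name is a hypothetical stand-in for the operations described above, not an API defined by the patent.

def calibrate_imu(device):
    left, right = device.stereo_camera.capture_pair()        # 602: capture image
    point = identify_reference_point(left)                   # 604: user or automatic selection
    ref_lat, ref_lon, ref_elev = geocode_reference_point(    # 606: sign OCR, wireless
        left, point)                                         #      broadcast, or third party
    Dx, Dy, Dz = relative_location(left, right, point)       # 608: stereo/LIDAR/SONAR ranging
    fix = device_coordinates(ref_lat, ref_lon, ref_elev,     # 610: reference coordinates
                             Dx, Dy, Dz)                     #      plus relative displacement
    device.imu.calibrate(position_fix=fix)                   # 612: correct the IMU drift
    return fix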
[0055] The various steps, blocks, operations, and acts described
with respect to FIGS. 4 through 6 can be performed in sequential order,
in random order, simultaneously, in parallel, or in a combination thereof.
Further, in some embodiments, some of the steps, blocks, operations, and
acts can be omitted, skipped, modified, or added without departing from the
scope of the invention. Although the above description uses the
stereoscopic technique, it is understood that the use of other techniques
is not precluded.
[0056] FIG. 7 depicts an example computing environment
implementing the application, in accordance with various embodiments of
the present invention. As depicted, the computing environment comprises
at least one processing unit that is equipped with a control unit and an
Arithmetic Logic Unit (ALU), a memory, a storage unit, a clock chip, a
plurality of networking devices, and a plurality of input/output (I/O) devices.
The processing unit is responsible for processing the instructions of the
algorithm. The processing unit receives commands from the control unit in
order to perform its processing. Further, any logical and arithmetic
operations involved in the execution of the instructions are computed with
the help of the ALU.
[0057] The overall computing environment can be composed of
multiple homogeneous and/or heterogeneous cores, multiple CPUs of
different kinds, special media, and other accelerators. Further, the plurality
of processing units may be located on a single chip or spread over multiple
chips.
[0058] The algorithm, comprising the instructions and code required
for the implementation, is stored in the memory unit, the storage,
or both. At the time of execution, the instructions may be fetched from the
corresponding memory and/or storage, or over the communication
network, and executed by the processing unit. The processing unit
synchronizes the operations and executes the instructions based on the
timing signals generated by the clock chip. The embodiments disclosed
herein can be implemented through at least one software program running
on at least one hardware device and performing network management
functions to control the elements. The elements shown in FIGS. 1-7
include various units, blocks, modules, or steps described in relation to the
methods, processes, algorithms, or systems of the present invention, which
can be implemented using any general-purpose processor and any
combination of programming languages, applications, embedded
processors, and hardware components.
[0059] The foregoing description of the specific embodiments will
so fully reveal the general nature of the embodiments herein that others can,
by applying current knowledge, readily modify and/or adapt for various
applications such specific embodiments without departing from the generic
concept, and, therefore, such adaptations and modifications should and are
intended to be comprehended within the meaning and range of equivalents
of the disclosed embodiments. It is to be understood that the phraseology or
terminology employed herein is for the purpose of description and not of
limitation. Therefore, while the embodiments herein have been described in
terms of preferred embodiments, those skilled in the art will recognize that
the embodiments herein can be practiced with modification within the spirit
and scope of the embodiments as described herein.
STATEMENT OF CLAIMS
We claim:
1. A method for calibrating an inertial measurement unit (IMU) of an
electronic device, the method comprising:
receiving a reference point of an object in an environment;
determining geographic coordinates of said reference point;
determining relative location of said electronic device
relative to said reference point;
calculating geographic coordinates of said electronic device
using said geographic coordinates of said reference point and said
relative location of said electronic device; and
calibrating said IMU using said geographic coordinates of
said electronic device.
2. The method of claim 1, wherein said method further comprises
recalibrating said IMU using said geographic coordinates of said
electronic device.
3. The method of claim 1, wherein said method further comprises
implementing at least one of stereographic technique, light detection
and ranging (LIDAR) technique, and sound navigation and ranging
(SONAR) technique in said electronic device.
4. The method of claim 1, wherein said relative location of said
electronic device is determined using said at least one of stereo
vision, LIDAR technique, and SONAR technique.
5. The method of claim 1, wherein said method further comprises
receiving at least one image of said environment using said
electronic device.
6. The method of claim 5, wherein said at least one image of said
environment comprises recording a stereoscopic image of said
environment using said electronic device.
7. The method of claim 5, wherein said at least one image of said
environment comprises a three dimensional image.
8. The method of claim 5, wherein said at least one image of said
environment comprises a dynamic image.
9. The method of claim 1, wherein said method further comprises
identifying said reference point of said object in said at least one
image of said environment.
10. The method of claim 1, wherein said method further comprises
determining said geographic coordinates of said reference point of
said object using at least one of said electronic device, image
processing techniques, wireless communication, and third-party
sources.
11. A system for calibrating an inertial measurement unit (IMU), the
system comprising an electronic device configured to:
receive a reference point of an object in an environment,
determine geographic coordinates of said reference point,
determine relative location of said electronic device relative
to said reference point,
calculate geographic coordinates of said electronic device
using said geographic coordinates of said reference point and said
relative location of said electronic device, and
calibrate said IMU using said geographic coordinates of said
electronic device.
12. The system of claim 11, wherein said electronic device is further
configured to recalibrate said IMU using said geographic
coordinates of said electronic device.
13. The system of claim 11, wherein said electronic device is further
configured to implement at least one of stereographic technique,
light detection and ranging (LIDAR) technique, and sound
navigation and ranging (SONAR) technique.
14. The system of claim 11, wherein said electronic device is further
configured to implement said IMU.
15. The system of claim 11, wherein said object is configured to
implement said IMU.
16. The system of claim 11, wherein said electronic device is further
configured to determine said relative location of said electronic
device using said at least one of stereo vision technique, LIDAR
technique, and SONAR technique.
17. The system of claim 11, wherein said electronic device is further
configured to receive at least one image of said environment.
18. The system of claim 11, wherein said electronic device is further
configured to implement a stereo camera to capture said at least one
image of said environment.
19. The system of claim 18, wherein said stereo camera is configured to
record at least one stereoscopic image of said environment.
20. The system of claim 18, wherein said at least one image of said
environment comprises a three dimensional image.
21. The system of claim 18, wherein said at least one image of said
environment comprises a dynamic image.
22. The system of claim 11, wherein said electronic device is further
configured to identify said reference point of said object in said at
least one image of said environment.
23. The system of claim 11, wherein said electronic device is further
configured to determine said geographic coordinates of said reference
point of said object using at least one of said electronic device, image
processing technique, wireless communication, and third-party sources.

Documents

Application Documents

# Name Date
1 Form5.pdf 2013-03-20
2 FORM 3.pdf 2013-03-20
3 Drawings.pdf 2013-03-20
4 Disclosure_16_SEL_12_460_Form 2.pdf 2013-03-20
5 835-DEL-2013-GPA-(15-04-2013).pdf 2013-04-15
6 835-DEL-2013-Form-1-(15-04-2013).pdf 2013-04-15
7 835-DEL-2013-Correspondence-Others-(15-04-2013).pdf 2013-04-15
8 SEL_New POA_ipmetrix.pdf 2015-04-16
9 FORM 13-change of POA - Attroney.pdf 2015-04-16
10 835-DEL-2013-FER.pdf 2017-12-20
11 835-DEL-2013-FER_SER_REPLY [29-05-2018(online)].pdf 2018-05-29
12 835-DEL-2013-Amendment Of Application Before Grant - Form 13 [29-05-2018(online)].pdf 2018-05-29
13 835-DEL-2013-8(i)-Substitution-Change Of Applicant - Form 6 [10-10-2019(online)].pdf 2019-10-10
14 835-DEL-2013-ASSIGNMENT DOCUMENTS [10-10-2019(online)].pdf 2019-10-10
15 835-DEL-2013-FORM-26 [11-10-2019(online)].pdf 2019-10-11
16 835-DEL-2013-IntimationOfGrant07-06-2021.pdf 2021-06-07
17 835-DEL-2013-PatentCertificate07-06-2021.pdf 2021-06-07
18 835-DEL-2013-FORM 4 [20-04-2022(online)].pdf 2022-04-20
19 835-DEL-2013-RELEVANT DOCUMENTS [23-08-2022(online)].pdf 2022-08-23
20 835-DEL-2013-PROOF OF ALTERATION [17-01-2024(online)].pdf 2024-01-17

Search Strategy

1 Searchstrategy835-del-2013_28-08-2017.pdf

ERegister / Renewals

3rd: 01 Sep 2021 (20/03/2015 to 20/03/2016)
4th: 01 Sep 2021 (20/03/2016 to 20/03/2017)
5th: 01 Sep 2021 (20/03/2017 to 20/03/2018)
6th: 01 Sep 2021 (20/03/2018 to 20/03/2019)
7th: 01 Sep 2021 (20/03/2019 to 20/03/2020)
8th: 01 Sep 2021 (20/03/2020 to 20/03/2021)
9th: 01 Sep 2021 (20/03/2021 to 20/03/2022)
10th: 20 Apr 2022 (20/03/2022 to 20/03/2023)
11th: 17 Mar 2023 (20/03/2023 to 20/03/2024)
12th: 18 Mar 2024 (20/03/2024 to 20/03/2025)
13th: 17 Mar 2025 (20/03/2025 to 20/03/2026)