
A Method And A Guided Imaging Unit For Guiding A User To Capture An Image

Abstract: Embodiments of the present disclosure provide a method for guiding a user to capture an image of a target object using an image capturing device. In an embodiment, the method comprises determining a bounding area for the image to be captured and capturing at least one frame of the image upon detecting the image to be inside the bounding area. The target object in the captured at least one frame is then segmented by separating the target object from the rest of the image. Further, at least one of symmetry and self-similarity of the segmented target object is determined. In addition, at least one image parameter is determined by a sensor. The method then provides inputs, based on at least one of the determined symmetry, self-similarity, and the at least one image parameter, for guiding the user to capture a final image of the target object. Figs. 3A and 3B


Patent Information

Application #:
Filing Date: 11 March 2014
Publication Number: 12/2014
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: ipo@knspartners.com
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2023-06-27
Renewal Date:

Applicants

WIPRO LIMITED
Doddakannelli, Sarjapur Road, Bangalore 560035, Karnataka, India.

Inventors

1. RAMACHANDRA BUDIHAL
#540, 6th main, 11th cross, Girinagar, 2nd Phase, Bangalore 560085, Karnataka

Specification

CLAIMS:
We claim:

1. A computer implemented method for guiding a user to capture an image of a target object using an image capturing device, comprising:
determining a bounding area for the image to be captured;
capturing at least one frame of the image upon detecting the image to be inside the bounding area;
segmenting the target object in the captured at least one frame by separating the target object from the rest of the image;
determining at least one of symmetry and self-similarity of the segmented target object;
determining at least one image parameter by a sensor; and
providing one or more inputs, based on at least one of the determined symmetry, the determined self-similarity, and the at least one image parameter of the image, for guiding the user to capture a final image of the target object.

2. The method as claimed in claim 1, wherein the determining of the bounding area for the image to be captured comprises:
prompting the user to provide attributes of the image; and
determining the bounding area based on the attributes.

3. The method as claimed in claim 1, wherein the determining of the bounding area for the image to be captured comprises:
providing a plurality of bounding areas to the user for selection; and
determining the bounding area based on the selection of the user.

4. The method as claimed in claim 1, wherein the determining of the bounding area for the image to be captured comprises:
prompting the user to manually draw the bounding area around the image.

5. The method as claimed in claim 1 further comprising providing a direction indicator indicating direction for moving the image capturing device.

6. The method as claimed in claim 1 further comprising storing data related to at least one of the final image and a part of the final image provided by the user.

7. The method as claimed in claim 6, wherein the data is selected from at least one of voice, video, audio, hyperlink image, barcode, Quick Response (QR) code, reference images, digital hand sketches, markings and software program.

8. The method as claimed in claim 1, wherein the one or more inputs are provided in a format comprising text, image, audio, video and vibration in different patterns and frequencies.

9. The method as claimed in claim 1, wherein the at least one image parameter comprises orientation, direction, slope angles, amount of light and distance of the image capturing device.

10. The method as claimed in claim 1 further comprising watermarking and encrypting the final image with a unique identifier.

11. The method as claimed in claim 1, wherein the symmetry of the segmented object is determined by identifying at least one of vertical axis, horizontal axis and complementary side of the segmented object.

12. A guided imaging unit for guiding a user to capture an image of a target object, the guided imaging unit comprising:
at least one processor;
a sensor for sensing at least one image parameter;
a memory storing instructions executable by the at least one processor, wherein on execution of the instructions, the at least one processor:
determines a bounding area for the image to be captured;
captures at least one frame of the image upon detecting the image to be inside the bounding area;
segments the target object in the captured at least one frame by separating the target object from the rest of the image;
determines at least one of symmetry and self-similarity of the segmented target object;
determines at least one image parameter using the sensor; and
provides one or more inputs, based on at least one of the determined symmetry, the determined self-similarity, and the at least one image parameter, for guiding the user to capture a final image of the target object.

13. The guided imaging unit as claimed in claim 12, wherein the guided imaging unit is located in at least one of the image capturing device and a server communicably connected to the image capturing device.

14. The guided imaging unit as claimed in claim 12, wherein the memory stores the captured images and a plurality of bounding areas.

15. The guided imaging unit as claimed in claim 12, wherein the sensor is selected from a group comprising gyroscope, compass, 3-axis accelerometer, inclinometer, light meter, Infrared ranger and ultrasound ranger.

16. The guided imaging unit as claimed in claim 12, wherein the at least one image parameter comprises orientation, direction, slope angles, amount of light and distance of the image capturing device.

17. A non-transitory computer readable medium including instructions stored thereon that when processed by at least one processor cause a guided imaging unit to perform acts of:
determining a bounding area for the image to be captured;
capturing at least one frame of the image upon detecting the image to be inside the bounding area;
segmenting the target object in the captured at least one frame by separating the target object from the rest of the image;
determining at least one of symmetry and self-similarity of the segmented target object;
determining at least one image parameter by a sensor; and
providing one or more inputs, based on at least one of the determined symmetry, the determined self-similarity, and the at least one image parameter, for guiding the user to capture a final image of the target object.

18. The medium as claimed in claim 17, wherein the instructions further cause the at least one processor to perform operations comprising providing a direction indicator indicating direction for moving the image capturing device.

19. The medium as claimed in claim 17, wherein the instructions further cause the at least one processor to perform operations comprising storing data related to at least one of the final image and a part of the final image provided by the user.

20. The medium as claimed in claim 17, wherein the instructions further cause the at least one processor to perform operations comprising watermarking and encrypting the final image with a unique identifier.
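As a rough illustration of the symmetry determination recited in claim 11, bilateral symmetry of a segmented object about its vertical axis can be scored by mirroring each row of the binary segmentation mask and counting matching pixels. This is a minimal sketch under assumed representations (a mask as a list of 0/1 rows); the specification does not prescribe this particular scoring.

```python
def vertical_symmetry(mask):
    """Score symmetry of a segmented binary mask about its vertical
    (central) axis: the fraction of pixels that equal their horizontally
    mirrored counterpart. 1.0 means perfectly symmetric.
    The mask format and the scoring rule are illustrative assumptions."""
    total = 0
    matches = 0
    for row in mask:
        mirrored = row[::-1]  # flip the row about the vertical axis
        for a, b in zip(row, mirrored):
            total += 1
            matches += (a == b)
    return matches / total if total else 0.0
```

An analogous score about the horizontal axis could be obtained by mirroring the rows top-to-bottom instead.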

Dated this 11th day of March, 2014
R. RAMYA RAO
OF K & S PARTNERS
AGENT FOR THE APPLICANT
TECHNICAL FIELD

The present disclosure relates to image processing. In particular, embodiments of the present disclosure include a method and a guided imaging unit for guiding a user to capture an image.
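The guidance step of the disclosed method — checking whether the detected object lies inside the bounding area and otherwise telling the user which way to move the device — can be sketched as below. The rectangular-box representation, function name, and hint strings are all illustrative assumptions, not taken from the specification.

```python
def guidance_hint(obj_box, bound_box):
    """Compare the detected object's box against the target bounding
    area and return a movement hint, or 'capture' when the object is
    fully inside. Boxes are (left, top, right, bottom) in image
    coordinates with y growing downward; hint wording is illustrative."""
    l, t, r, b = obj_box
    L, T, R, B = bound_box
    hints = []
    if l < L:
        hints.append("pan left")    # object spills past the left edge
    if r > R:
        hints.append("pan right")   # object spills past the right edge
    if t < T:
        hints.append("tilt up")     # object spills past the top edge
    if b > B:
        hints.append("tilt down")   # object spills past the bottom edge
    return ", ".join(hints) if hints else "capture"
```

In an implementation such hints could equally be rendered as text, audio, or vibration patterns, as the claims contemplate.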

Documents

Application Documents

# Name Date
1 1258-CHE-2014 FORM-9 11-03-2014.pdf 2014-03-11
2 1258-CHE-2014 FORM-18 11-03-2014.pdf 2014-03-11
3 IP26272-spec.pdf 2014-03-12
4 IP26272-fig.pdf 2014-03-12
5 FORM 5.pdf 2014-03-12
6 FORM 3.pdf 2014-03-12
7 Certified copy request.pdf 2014-03-12
8 abstract1258-CHE-2014.jpg 2014-03-15
9 1258-CHE-2014 FORM-1 10-06-2014.pdf 2014-06-10
10 1258-CHE-2014 CORRESPONDENCE OTHERS 10-06-2014.pdf 2014-06-10
11 1258-CHE-2014-Request For Certified Copy-Online(16-02-2015).pdf 2015-02-16
12 1258CHE2014_Certifiedcoyrequest.pdf ONLINE 2015-02-18
13 1258CHE2014_Certifiedcoyrequest.pdf 2015-03-13
14 1258-CHE-2014-FER.pdf 2019-06-28
15 1258-CHE-2014-ABSTRACT [30-12-2019(online)].pdf 2019-12-30
16 1258-CHE-2014-CLAIMS [30-12-2019(online)].pdf 2019-12-30
17 1258-CHE-2014-DRAWING [30-12-2019(online)].pdf 2019-12-30
18 1258-CHE-2014-FER_SER_REPLY [30-12-2019(online)].pdf 2019-12-30
19 1258-CHE-2014-FORM 3 [30-12-2019(online)].pdf 2019-12-30
20 1258-CHE-2014-FORM-26 [30-12-2019(online)].pdf 2019-12-30
21 1258-CHE-2014-US(14)-HearingNotice-(HearingDate-06-02-2023).pdf 2023-01-17
22 1258-CHE-2014-AMENDED DOCUMENTS [30-01-2023(online)].pdf 2023-01-30
23 1258-CHE-2014-Correspondence to notify the Controller [30-01-2023(online)].pdf 2023-01-30
24 1258-CHE-2014-FORM 13 [30-01-2023(online)].pdf 2023-01-30
25 1258-CHE-2014-POA [30-01-2023(online)].pdf 2023-01-30
26 1258-CHE-2014-FORM 3 [21-02-2023(online)].pdf 2023-02-21
27 1258-CHE-2014-PETITION UNDER RULE 137 [21-02-2023(online)].pdf 2023-02-21
28 1258-CHE-2014-Written submissions and relevant documents [21-02-2023(online)].pdf 2023-02-21
29 1258-CHE-2014-FORM-26 [24-02-2023(online)].pdf 2023-02-24
30 1258-CHE-2014-IntimationOfGrant27-06-2023.pdf 2023-06-27
31 1258-CHE-2014-PatentCertificate27-06-2023.pdf 2023-06-27

Search Strategy

1 searchstrat_27-06-2019.pdf

ERegister / Renewals

3rd: 11 Sep 2023 (11/03/2016 to 11/03/2017)
4th: 11 Sep 2023 (11/03/2017 to 11/03/2018)
5th: 11 Sep 2023 (11/03/2018 to 11/03/2019)
6th: 11 Sep 2023 (11/03/2019 to 11/03/2020)
7th: 11 Sep 2023 (11/03/2020 to 11/03/2021)
8th: 11 Sep 2023 (11/03/2021 to 11/03/2022)
9th: 11 Sep 2023 (11/03/2022 to 11/03/2023)
10th: 11 Sep 2023 (11/03/2023 to 11/03/2024)
11th: 07 Mar 2024 (11/03/2024 to 11/03/2025)
12th: 07 Mar 2025 (11/03/2025 to 11/03/2026)