ABSTRACT
The embodiments herein relate to augmented reality applications and, more particularly, to tracking an object's orientation and position using an external sensor in augmented reality applications. A sensor unit is connected/attached to an object. Further, a signal is sent from a source unit as input to the sensor unit. The sensor unit comprises a plurality of sensors, which receive the input signal, and an orientation sensor. The sensors in the sensor unit receive the input signal at different time instances. The time information and the data from the orientation sensor are sent as a response to the source unit. The source unit, based on the response data, identifies the position and orientation of the sensor unit with respect to the source unit. The identified position and orientation information may be used to identify the location of any component of the object, i.e. the object of interest, with respect to the position of the sensor unit.
FIG. 1
CLAIMS
We claim:
1. A method for tracking an object's orientation and position using an external sensor in augmented reality applications, said method comprising:
connecting a sensor unit to an object;
synchronizing clock between said sensor unit and a source unit;
sending an input signal from said source unit to said sensor unit;
sending a response from said sensor unit to said source unit; and
identifying, by said source unit, position and orientation of said sensor unit based on said received response.
2. The method as in claim 1, wherein said sensor unit is an array of sensors.
3. The method as in claim 1, wherein said response further comprises a plurality of sensor data and an orientation data.
4. The method as in claim 3, wherein said plurality of sensor data further comprises time information from a plurality of sensors in said sensor unit.
5. The method as in claim 4, wherein said time information is different for each of said plurality of sensors in said sensor unit.
6. The method as in claim 1, wherein said position and orientation of said sensor unit is identified with respect to said source unit.
7. A system for tracking an object's orientation and position using an external sensor in augmented reality applications, said system comprising:
a sensor unit connected to said object;
a source unit;
said system configured to synchronize clock between said sensor unit and said source unit;
said sensor unit configured to receive input signal from said source unit;
said sensor unit configured to send response to said source unit; and
said source unit configured to identify position of said sensor unit based on said received response.
8. The system as in claim 7, wherein said sensor unit is an array of sensors.
9. The system as in claim 7, wherein said sensor unit is configured to send a plurality of sensor data and an orientation data as response to said source unit.
10. The system as in claim 9, wherein said sensor unit is configured to send time information from a plurality of sensors in said sensor unit as said sensor data.
11. The system as in claim 7, wherein said system is configured to identify position and orientation of said sensor unit with respect to said source unit.
Dated: 29-05-2013
Signature:
Vikram Pratap Singh Thakur
Patent Agent
FORM 2
The Patents Act, 1970
(39 of 1970)
&
The Patent Rules, 2005
COMPLETE SPECIFICATION
(SEE SECTION 10 AND RULE 13)
TITLE OF THE INVENTION
“TRACKING POSITION AND ORIENTATION USING EXTERNAL SENSOR FOR AUGMENTED REALITY APPLICATIONS”
APPLICANTS:
Name : HCL Technologies Ltd
Nationality : Indian
Address : HCL Technologies Ltd., 50-53 Greams
Road, Chennai – 600006, Tamil Nadu, India
The following specification particularly describes and ascertains the nature of this invention and the manner in which it is to be performed:-
FIELD OF INVENTION
[001] The embodiments herein relate to augmented reality applications and, more particularly, to tracking an object's orientation and position using an external sensor in augmented reality applications.
BACKGROUND OF INVENTION
[002] Augmented Reality (AR) is a live, direct or indirect view of real-world physical objects whose elements are augmented with computer graphics, such that the user cannot easily differentiate between what is real and what is virtual. Augmented reality systems are used in various applications related to gaming, art, commerce, navigation, the military and so on. Various technologies have been used in augmented reality based applications for identifying and tracking the user's point of interest so as to overlay digital and virtual content onto his or her view. Amongst these, the most widely used is vision based object recognition, in which continuous live video is fed from the user's camera device. Various algorithms and matching techniques are used to compare each frame of the video to identify and track the object's position. Once the object is identified, based on various calculations, graphical virtual content is rendered on top of the object.
[003] However, this system has certain disadvantages. One disadvantage of existing vision based AR systems is that recognizing the object from each frame is difficult and time consuming and has low efficiency under low light conditions. To address this recognition problem, Augmented Reality (AR) markers have been used. An AR marker is a 2D image pattern that is easily recognizable and traceable. These markers are associated with a real world physical object by positioning the marker on a known location of the object to be tracked. Upon identifying the marker in the video feed, the camera device can then render the virtual content relative to it. The problem with using AR markers in augmented reality applications is that the marker has to remain in the camera's field of view at all times, which is not possible for wide objects or multi-faced 3D objects.
[004] There is, therefore, a need for a system for augmented reality applications that eliminates the requirement that a marker remain in the camera's field of view at all times, recognizes objects under bad or low light conditions, and provides more reliable object recognition.
SUMMARY
[005] In view of the foregoing, an embodiment herein provides a method for tracking an object's orientation and position using an external sensor in augmented reality applications, the method comprising connecting a sensor unit to an object; synchronizing clock between the sensor unit and a source unit; sending an input signal from the source unit to the sensor unit; sending a response from the sensor unit to the source unit; and identifying, by the source unit, position and orientation of the sensor unit based on the received response.
[006] Embodiments further disclose a system for tracking an object's orientation and position using an external sensor in augmented reality applications, the system comprising a sensor unit connected to the object; a source unit; the system configured to synchronize clock between the sensor unit and the source unit; the sensor unit configured to receive an input signal from the source unit; the sensor unit configured to send a response to the source unit; and the source unit configured to identify position of the sensor unit based on the received response.
[007] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings.
BRIEF DESCRIPTION OF THE FIGURES
[008] The embodiments herein will be better understood from the following detailed description with reference to the drawings, in which:
[009] FIG. 1 illustrates a block diagram of a position and orientation tracking system, as disclosed in the embodiments herein;
[0010] FIG. 2 illustrates a block diagram which shows various components of the sensor unit, as disclosed in the embodiments herein;
[0011] FIG. 3 illustrates a block diagram that shows various components of the source unit, as disclosed in the embodiments herein;
[0012] FIG. 4 is a flow diagram which shows various steps involved in the process of tracking the object of interest using the position and orientation tracking system, as disclosed in the embodiments herein; and
[0013] FIGS. 5A, 5B, and 5C illustrate example implementations of the position and orientation tracking system, as disclosed in the embodiments herein.
DETAILED DESCRIPTION OF THE INVENTION
[0014] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[0015] The embodiments herein disclose a position and orientation tracking system to track the user's object of interest using an external sensor in augmented reality applications. Referring now to the drawings, and more particularly to FIGS. 1 through 5C, where similar reference characters denote corresponding features consistently throughout the figures, embodiments are shown.
[0016] FIG. 1 illustrates a block diagram of a position and orientation tracking system, as disclosed in the embodiments herein. The position and orientation tracking system 100 comprises a sensor unit 101 and a source unit 102. The source unit 102 and the sensor unit 101 can communicate with each other via a communication channel. The communication channel may be any known wired or wireless channel. The source unit 102 may be a mobile phone, goggles or any other special gear which has a camera to view the object of interest, sufficient processing power to render the virtual content, and a display for displaying the virtual content to the user. The user's object of interest is tracked by placing the sensor unit 101 on the object of interest; for example, the object of interest may be a device such as a printer. Using the position and orientation tracking system, the user can track the object of interest from a certain distance and may obtain and view configuration details of the object indicating the position of various modules/components of the object of interest. For example, if the object of interest is a printer, the user may identify the location of trays or any other components of the printer using the augmented reality application, once the location and orientation data of the sensor unit 101 is identified. The source unit 102 transmits a signal pulse to the sensor unit 101, which is placed on/is connected to the object being tracked; the sensor unit 101 receives the signal pulse and transmits the sensed data back to the source unit 102 as a packaged response. Further, the source unit 102 fetches and processes the received data to track the position and orientation of the sensor unit 101. This data may then be fed into the augmented reality application, which maps virtual content onto the real world object and displays it to the user. For example, virtual content such as the location of a printer tray, tray information and any other details can be displayed on the source unit's display over the real world object, such as the tray of the printer.
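The round trip described above can be summarized programmatically. The following is a minimal Python sketch of the source-unit side; the hardware steps are injected as callables, and every name here (send_pulse, receive_response, compute_pose, render_overlay, SensorResponse) is an illustrative placeholder, not part of the disclosed system.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class SensorResponse:
    """Packaged response from the sensor unit 101 (hypothetical layout)."""
    arrival_times: List[float]               # one reception timestamp per array sensor
    orientation: Tuple[float, float, float]  # e.g. compass heading, pitch, roll

def track_once(send_pulse: Callable[[], float],
               receive_response: Callable[[], SensorResponse],
               compute_pose: Callable[[SensorResponse, float], tuple],
               render_overlay: Callable[[tuple], None]) -> None:
    """One tracking round trip as described for FIG. 1."""
    t_sent = send_pulse()                   # signal pulse generator fires
    response = receive_response()           # packaged data from the sensor unit
    pose = compute_pose(response, t_sent)   # position + orientation relative to source
    render_overlay(pose)                    # AR application overlays virtual content
```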
[0017] FIG. 2 illustrates a block diagram which shows various components of the sensor unit, as disclosed in the embodiments herein. The sensor unit 101 further comprises a wireless data transceiver 201, a sensor array 202, an orientation sensor 203, a data collection module 204 and a timer 205. The wireless data transceiver 201 is used to establish a communication channel between the source unit 102 and the sensor unit 101 and to transmit the data from the data collection module 204 to the source unit 102. The wireless data transceiver 201 may use any suitable technology such as Bluetooth, Wi-Fi or any other short range wireless communication means to establish the connection with the source unit 102. The data collection module 204 collects the data from the sensor array 202 and the data from the orientation sensor 203. The sensor array 202 further comprises a plurality of sensors which may be arranged in a specific order. For example, the sensors in the sensor array 202 may be arranged into multiple rows and columns, as depicted in FIG. 5A. The sensors in the sensor array 202 may be any kind of known sensors, such as RF signal sensors, ultrasound sensors and so on, and may be selected depending on the type of signals being transmitted by the source unit 102. The sensors in the sensor array 202 receive the signal from the source unit 102. The received signal is sensed by the plurality of sensors in the sensor array 202 and the sensed data is collected by the data collection module 204. The orientation sensor 203 further comprises a compass and a gyroscope to identify the direction, tilt, pitch and roll of the sensor unit 101 with respect to the source unit 102. The orientation sensor 203 detects the orientation of the sensor unit 101, and its data is also collected by the data collection module 204. In addition to the orientation and sensor data, time-stamp data is also collected by the data collection module 204. The data values from the orientation sensor 203 remain the same when the object being tracked is stationary; the values change as soon as the object being tracked is moved or tilted. The data collection module 204 may then package the data received from the sensor array 202 and the orientation sensor 203 and provide it to the wireless data transceiver 201, which may then transmit the data to the source unit 102.
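As one illustration of the packaging step, the sketch below serializes the per-sensor reception timestamps, the orientation reading and a collection time-stamp into a single payload for the transceiver. The field layout and the JSON encoding are assumptions made for illustration only; the specification does not fix a wire format.

```python
import json
import time
from dataclasses import asdict, dataclass
from typing import List, Tuple

@dataclass
class SensorPacket:
    """Fields mirror the data named for the data collection module 204."""
    arrival_times: List[float]               # per-sensor reception timestamps
    orientation: Tuple[float, float, float]  # compass heading, pitch, roll
    collected_at: float                      # time-stamp from the timer 205

def package_response(arrival_times, orientation) -> bytes:
    """Collect and serialize one response for the wireless data transceiver 201."""
    packet = SensorPacket(list(arrival_times), tuple(orientation), time.time())
    return json.dumps(asdict(packet)).encode("utf-8")
```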
[0018] FIG. 3 illustrates a block diagram that shows various components of the source unit, as disclosed in the embodiments herein. The source unit 102 further comprises a wireless data transceiver 301, a signal pulse generator 302, a plurality of orientation sensors 303, a position and orientation identifier 304 and a timer 305. The wireless data transceiver 301 may use Bluetooth, Wi-Fi or any known short range communication channel, and is used to establish a communication channel between the source unit 102 and the wireless data transceiver 201 in the sensor unit 101. The source unit 102 and the sensor unit 101 may be configured initially by synchronizing the timer module of the sensor unit 101 with the timer module of the source unit 102. The signal pulse generator 302 transmits a signal pulse from the source unit 102 which can be sensed by the sensor array 202 in the sensor unit 101. The signal may be any radio signal, sound signal, magnetic field or a combination of the above, provided the respective sensors are used in the sensor array 202. The orientation sensors 303 sense the orientation of the source unit 102. The position and orientation identifier 304 identifies the position and orientation of the sensor unit 101 with respect to the source unit 102 by processing the data received from the sensor unit 101.
[0019] FIG. 4 is a flow diagram which describes the various steps involved in the process of tracking the object of interest using the position and orientation tracking system, as disclosed in the embodiments herein. Using the position and orientation tracking system, the user can track the object of interest from a certain distance and may obtain and view configuration details of the object indicating the position of various modules/components of the object of interest. For example, if the object of interest is a printer, the user may track the location of trays or any other components of the printer using the position and orientation tracking system. To track a specific object, the sensor unit 101 is connected/attached to the object that has to be tracked. The sensor unit 101 can communicate with the source unit 102 using a suitable wireless communication system such as Bluetooth, Wi-Fi and so on. Initially, the timer 305 in the source unit 102 is synchronized (402) with the timer 205 in the sensor unit 101. Further, when the user intends to track the object, he/she may send an input signal pulse from the source unit 102, using the signal pulse generator 302, to the sensor unit 101.
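The specification does not prescribe a particular scheme for synchronizing the two timers. One plausible choice is a Cristian/NTP-style two-way timestamp exchange, sketched below under that assumption.

```python
def estimate_clock_offset(t1: float, t2: float, t3: float, t4: float) -> float:
    """NTP-style offset estimate from one two-way timestamp exchange.

    t1: request sent (read on the source unit's timer 305)
    t2: request received (read on the sensor unit's timer 205)
    t3: reply sent (sensor unit's timer 205)
    t4: reply received (source unit's timer 305)
    Returns the estimated offset of the sensor clock relative to the
    source clock, assuming a roughly symmetric channel delay.
    """
    return ((t2 - t1) + (t3 - t4)) / 2.0

# The source unit can then correct arrival times reported by the sensor unit:
# corrected_time = reported_time - estimate_clock_offset(t1, t2, t3, t4)
```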
[0020] The input signal pulse from the source unit 102 is received (404) by the sensor array 202 in the sensor unit 101. In a preferred embodiment, the strength of the input signal received by the different sensors in the sensor array 202 may vary depending on parameters such as the distance of the source unit 102 from each of the sensors in the sensor array 202, the direction/angle formed between the source unit 102 and the sensors in the sensor array 202, and so on. The sensors in the sensor array 202 sense even minute changes in signal strength. The data from all the sensors in the sensor array 202 and the data from the orientation sensor 203, which includes time-stamp information, are collected by the data collection module 204, packaged, and transmitted (406) back to the position and orientation identifier 304 in the source unit 102 for further processing. The position and orientation identifier 304 processes the data received from the sensor unit 101 together with the data available from the orientation sensors 303 and calculates the position and orientation of the sensor unit 101 relative to the source unit 102. The source unit 102 may be pre-loaded with an augmented reality application and specification data of the object of interest to be tracked. To identify a specific point of interest, i.e. a specific component of the object being tracked, the measured position and orientation information may be compared with the pre-configured specification data of the object of interest. The specification data may refer to system configurations and component information related to the object. The identified position and orientation data is fed into the augmented reality application, which calculates and renders the virtual content. The rendered virtual content is mapped onto the real world object and is displayed on the display of the source unit 102. The various actions in method 400 may be performed in the order presented, in a different order or simultaneously. Further, in some embodiments, some actions listed in FIG. 4 may be omitted.
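The specification does not give the calculation itself. As one illustrative possibility, if the per-sensor arrival times are treated as time differences of arrival (TDOA) and the source is far relative to the array's size, the direction from the array toward the source can be recovered by a plane-wave least-squares fit, as sketched below. The array geometry, the signal speed and the far-field assumption are all assumptions of this sketch, not part of the disclosure.

```python
import numpy as np

SIGNAL_SPEED = 343.0  # m/s, assuming an ultrasonic pulse in air

def estimate_direction(sensor_positions, arrival_times):
    """Estimate the unit direction from the sensor array toward the source.

    sensor_positions: (N, 3) coordinates of the array's sensors in the
    sensor unit's own frame; arrival_times: (N,) clock-corrected arrival
    times in seconds. Under a plane-wave (far-field) assumption,
    SIGNAL_SPEED * (t_i - t_0) = -(p_i - p_0) . u, where u points toward
    the source; a planar 3x3 array (FIG. 5A) resolves u only up to the
    ambiguity across the array's plane.
    """
    p = np.asarray(sensor_positions, dtype=float)
    t = np.asarray(arrival_times, dtype=float)
    A = -(p[1:] - p[0])                 # baselines relative to sensor 0
    b = SIGNAL_SPEED * (t[1:] - t[0])   # measured range differences
    u, *_ = np.linalg.lstsq(A, b, rcond=None)
    return u / np.linalg.norm(u)        # normalize to a unit direction
```

Range could similarly be estimated from the pulse's time of flight once the clocks are synchronized, with the compass and gyroscope data fixing the sensor unit's attitude; the specification leaves the exact computation open.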
[0021] FIGS. 5A, 5B, and 5C illustrate example implementations of the position and orientation tracking system, as disclosed in the embodiments herein. In one embodiment, the arrangement of the plurality of sensors in the sensor array 202 of the sensor unit 101 is shown in FIG. 5A. The sensors in the sensor array 202 may be arranged in any manner such that the input signals may be fetched easily. In one embodiment, the sensors are arranged in three rows and three columns; they may equally be arranged in any number of rows and columns or in any other shape and form. The sensors can be any kind of sensors, chosen based on the type of signal pulse transmitted by the source unit 102, for example RF sensors, ultrasonic sensors, etc. FIG. 5B illustrates a method and system of tracking the sensor unit in one embodiment. In one embodiment, the source unit 102 can be a mobile device with a camera. However, the source unit 102 may be goggles or any special gear which has a camera to view the object of interest, processing power to render the virtual content and a display for displaying the virtual content to the user. The sensor unit 101 receives the signal transmitted by the source unit 102. The sensors in the sensor unit 101 are denoted A, B, C, etc. to indicate their positions. The signal strength and the time at which the signal is received by each sensor in the sensor unit 101 vary based on the direction and position of the source unit 102. For example, in FIG. 5B(1), the sensor denoted G receives the signal pulse from the source unit 102 first and its signal strength is higher compared to the other sensors in the sensor unit 101. In another example, as illustrated in FIG. 5B(2), the sensor denoted C receives the signal pulse first and the strength of its signal is higher compared to the other sensors in the sensor unit 101. These variations in signal reception time and signal strength across the sensors in the sensor unit 101 are used to identify the exact location of the sensor unit 101 with respect to the source unit 102.
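The FIG. 5B examples reduce to picking the sensor with the earliest arrival (and, on a tie, the strongest signal). A minimal sketch of that selection, with hypothetical per-sensor readings, is:

```python
from typing import Dict, Tuple

def first_hit_sensor(readings: Dict[str, Tuple[float, float]]) -> str:
    """Return the label of the sensor that received the pulse first.

    readings maps a sensor label ('A'..'I' for the 3x3 grid of FIG. 5A)
    to (arrival_time_s, signal_strength). Earliest arrival wins; higher
    signal strength breaks ties, matching the FIG. 5B examples.
    """
    return min(readings, key=lambda k: (readings[k][0], -readings[k][1]))

# Example resembling FIG. 5B(2): sensor C hears the pulse first.
print(first_hit_sensor({'C': (0.0041, 0.9), 'G': (0.0043, 0.6)}))  # -> 'C'
```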
[0022] One example implementation of this system is illustrated in FIG. 5C. Assume that the user's object of interest is to identify the trays (701, 702) in the printer 700. The sensor unit 101 is placed at any known location on the printer 700. The source unit 102 can be any mobile device, such as a mobile phone, a personal computer or any such device with a camera, a display and sufficient processing power to render the virtual content. The mobile device is pre-loaded with an augmented reality application to display the virtual content on the real world object, and the printer specification information of the printer 700 is pre-loaded into that augmented reality application. The timers of the source unit 102 and the sensor unit 101 are synchronized over the wireless data transceivers. The signal pulse generator 302 may be a simple speaker in the mobile device which transmits the signal pulse. The sensor unit 101 receives the signal pulse from the source unit 102. The data collection module in the sensor unit 101 collects the data from the sensors in the sensor array and the data from the orientation sensor in the sensor unit 101, including time-stamp data, and the collected data is packaged and transmitted back to the source unit 102. The received data is processed, using mathematical computations, by the position and orientation identifier in the source unit 102, and the position and orientation data of the sensor unit 101 is identified. The identified data is fed into the augmented reality application, which renders the virtual content and maps it onto the real world object, in this case the trays of the printer. The tray information, such as Tray 1 and Tray 2, is displayed on the display of the mobile device over the real object.
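Once the sensor unit's pose is known, locating a component such as a tray amounts to a rigid transform of the offset stored in the pre-loaded specification data. The sketch below assumes, purely for illustration, a position vector, a rotation matrix derived from the orientation data, and a per-component offset in the object's frame; the specification does not fix any coordinate convention.

```python
import numpy as np

def locate_component(sensor_pos, sensor_rot, component_offset):
    """Locate a component (e.g. a printer tray) in the source unit's frame.

    sensor_pos: (3,) position of the sensor unit relative to the source unit;
    sensor_rot: (3, 3) rotation matrix from the object's frame to the source
    unit's frame (derived from the orientation data); component_offset: (3,)
    the tray's position relative to the sensor unit, as read from the
    pre-loaded printer specification. All hypothetical names.
    """
    return np.asarray(sensor_pos) + np.asarray(sensor_rot) @ np.asarray(component_offset)

# e.g. a tray 0.3 m below the sensor unit in the printer's frame:
# tray_in_source_frame = locate_component(pose_pos, pose_rot, [0.0, -0.3, 0.0])
```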
[0023] The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the network elements. The network elements shown in FIG. 1 include blocks which can be at least one of a hardware device, or a combination of a hardware device and a software module.
[0024] The embodiment disclosed herein specifies a system for tracking the position and orientation of an object in augmented reality applications. The mechanism allows tracking the position and orientation of an object in augmented reality applications, providing a system thereof. Therefore, it is understood that the scope of the protection is extended to such a program and, in addition to a computer readable means having a message therein, such computer readable storage means contain program code means for implementation of one or more steps of the method, when the program runs on a server or mobile device or any suitable programmable device. The method is implemented in a preferred embodiment through or together with a software program written in, e.g., Very high speed integrated circuit Hardware Description Language (VHDL) or another programming language, or implemented by one or more VHDL or software modules being executed on at least one hardware device. The hardware device can be any kind of device which can be programmed, including, e.g., any kind of computer like a server or a personal computer, or the like, or any combination thereof, e.g. one processor and two FPGAs. The device may also include means which could be, e.g., hardware means like an ASIC, or a combination of hardware and software means, e.g. an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein. Thus, the means are at least one hardware means and/or at least one software means. The method embodiments described herein could be implemented in pure hardware or partly in hardware and partly in software. The device may also include only software means. Alternatively, the invention may be implemented on different hardware devices, e.g. using a plurality of CPUs.
[0025] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the claims as described herein.