Abstract: A system (100) for enabling a collaborative environment between multiple devices using 6 Degree of Freedom (DoF) tracking and Internet of Things (IoT) comprises a centralized router (102) having a 6DoF tracking unit (104); a streaming unit (112); an audio unit (108); a wireless network module (106) to establish a wireless communication network (116); a processing module (110); and a power module (114) to provide electrical power to the centralized router (102). The system (100) further includes one or more devices (120) selected from a first set of devices (1202) and/or a second set of devices (1204). The one or more devices (120) are coupled with respective one or more 6DoF trackers (118), wherein each 6DoF tracker (118) is tracked by the 6DoF tracking unit (104) of the centralized router (102). Also, the second set of devices (1204) are connected with the centralized router (102) via the wireless communication network (116). [FIGURE 1]
DESC:FIELD OF THE INVENTION
Embodiments of the present invention generally relate to mixed reality-based technologies & home automation and more particularly to systems and methods for enabling collaborative environment between multiple devices using 6 Degree of Freedom (DoF) tracking and Internet of things (IoT).
BACKGROUND OF THE INVENTION
Nowadays, human beings are surrounded by smart devices everywhere. Most people in the world have become dependent on smart devices such as smartphones, smart watches, smart appliances etc. With further advancements in technology and the easy availability of the internet, smart appliances were interconnected so as to enable their remote operation and easy operation via a single mobile device. Consequently, the world witnessed a growth in home automation solutions. However, the problem with such solutions is that they operate only with smart devices, and the collaborative environment thus created never involved any other devices that do not have an internet connection. Many technologies such as artificial intelligence, machine learning and virtual/mixed/augmented reality systems have become increasingly popular, but at present none can be used to provide a truly collaborative system and immersive experience where everything is incorporated in a single system. Additionally, the presently available automation systems are limited to the use of voice commands and smartphones for controlling various operations. Moreover, none of them support object tracking. Further, there are no home automation systems available that include AR/VR/MR devices or enable their collaboration with each other or with other smart or non-smart (without connectivity) devices.
Additionally, existing systems for collaborative MR content visualization only offer options for multiple HMDs to collaborate in a common mixed/augmented reality space but do not offer any home automation capabilities. These systems also offer limited tracking capability in terms of degrees of freedom, and suffer from occlusion problems in multi-user tracking due to their optical-based approach using existing technologies. Furthermore, the presently available headsets offering collaborative capabilities are so expensive that it is not economical for individuals and/or educational institutes to buy them.
Therefore, there is a need in the art for a system and a method for enabling collaborative environment between multiple devices using 6 Degree of Freedom (DoF) tracking and Internet of Things (IoT), that do not suffer from the above-mentioned problems or at least provide a viable and cost-effective alternative.
OBJECT OF THE INVENTION
An object of the present invention is to provide a system and a method for enabling collaborative environment between multiple devices using 6 Degree of Freedom (DoF) tracking and Internet of things (IoT).
Another object of the present invention is to provide a remote-controlled smart home automation system with a 6DoF streaming module capable of streaming position and orientation data to any object attached with a 6DoF receiver, not limited to Mixed Reality, Augmented Reality and Virtual Reality Head Mounted Devices, and with a capability to control household appliances with the help of sensors and a computing unit connected to the Internet.
Yet another object of the present invention is to generate a collaborative environment including smart devices, household appliances and other household objects, and to enable their interaction using the internet and 6DoF tracking in real-time.
Yet another object of the present invention is to enable automation of remote operations of devices such as kitchen appliances, electrical and lighting systems, display and music systems via HMDs, voice commands and mobile applications.
Yet another object of the present invention is to provide a centralized router capable of powering any low-cost MR/AR/VR device/controller with 6DOF pose tracking and immersive high-quality MR content rendering capabilities over collaborative and multi-user modes.
Yet another object of the present invention is to enable remote operation of multiple smart phone functionalities such as calling, messaging etc. using a centralized router and/or connected HMDs.
Yet another object of the present invention is to enable streaming of video content from a PC, laptop or smartphone to an AR, VR and MR based HMD and vice versa.
Yet another object of the invention is to create a 4D image of the surroundings, including the one or more devices (120) and the people within, using RF-Radar based tracking, and to detect the number of people in the surroundings, recognise the people without any visual inputs and monitor physiological, behavioural & health analytics of the people.
SUMMARY OF THE INVENTION
According to a first aspect of the invention, there is provided a system for enabling collaborative environment between multiple devices using 6 Degree of Freedom (DoF) tracking and Internet of Things (IoT). The system comprises a centralized router having a 6DoF tracking unit; a streaming unit; an audio unit; a wireless network module to establish a wireless communication network; a processing module connected with the 6DoF tracking unit, the wireless network module and the streaming unit; and a power module to provide electrical power to the centralized router. The system further comprises one or more devices selected from a first set of devices and/or a second set of devices, coupled with respective one or more 6DoF trackers, each 6DoF tracker being tracked by the 6DoF tracking unit of the centralized router in real-time. Further, the second set of devices are connected with the centralized router via the wireless communication network. Additionally, the processing module is configured to receive 6DoF tracking data of each of the one or more devices coupled with the respective 6DoF tracker, from the 6DoF tracking unit; determine and track 6DoF pose of each of the one or more devices with respect to the centralized router and with respect to other one or more devices; and enable a collaborative environment for the one or more devices to perform one or more operations individually and/or in combination, based on the 6DoF pose of each of the one or more devices.
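Purely as an illustrative, non-limiting sketch (the function names and the Z-Y-X Euler-angle convention are assumptions for illustration, not part of the claimed system), the step of determining the 6DoF pose of one tracked device with respect to another may be performed by composing the two router-relative poses:

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    """Z-Y-X Euler angles (radians) to a 3x3 rotation matrix."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return rz @ ry @ rx

def relative_pose(pos_a, rot_a, pos_b, rot_b):
    """6DoF pose of device B expressed in device A's frame, given both
    devices' positions and rotations in the centralized router's frame."""
    rot_ab = rot_a.T @ rot_b            # relative orientation
    pos_ab = rot_a.T @ (pos_b - pos_a)  # relative translation
    return pos_ab, rot_ab
```

Each 6DoF pose here is the (x, y, z) position plus the yaw, pitch and roll reported for a tracker (118); once every device's pose is known relative to the router, any device-to-device relationship follows from this composition.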
In accordance with an embodiment of the present invention, the second set of devices are selected from one or more of electrical appliances, audio devices, display devices, Mixed Reality (MR) based Head Mounted Devices (HMDs) and mobile computing devices. Accordingly, the processing module is configured to enable the collaborative environment for the one or more devices to perform one or more operations by receiving one or more commands in the form of voice commands, gestures and/or touch inputs from the second set of devices and/or the audio unit of the centralized router; and performing one or more functions of the connected second set of devices based on the received one or more commands.
In accordance with an embodiment of the present invention, the one or more functions include switching on/off & changing settings of the second set of devices, playing/stopping music on audio devices & mobile computing devices, making & attending calls and messages on mobile computing devices, playing or changing video content on display devices and mobile computing devices, getting weather updates, getting news updates, setting alarms and controlling the electrical appliances.
In accordance with an embodiment of the present invention, the second set of devices connected with the centralized router are one or more HMDs. Accordingly, the processing module in combination with the streaming unit is configured to enable the one or more devices to perform the one or more operations by streaming mixed reality content in a common mixed reality space of the connected one or more HMDs, based on the 6DoF poses of each of the one or more devices, thereby enabling collaborative mixed reality sessions for users.
In accordance with an embodiment of the present invention, the first set of devices are selected from mechanical tools, sports equipment, fitness equipment and household objects. Accordingly, the processing module in combination with the streaming unit is configured to perform the one or more operations by generating 3D video content related to the first set of devices, fused with the 6DoF poses of each of the one or more devices; and displaying the 3D video content in one or more of the connected display devices, the mobile computing devices and a mixed reality space of the connected one or more HMDs.
In accordance with an embodiment of the present invention, the centralized router further comprises one or more sensors to enable the processing module to lock and unlock the centralized router using biometrics.
In accordance with an embodiment of the present invention, the 6DoF tracking unit (104) implements EM tracking, optical/visual tracking, visual-inertial tracking, WiFi based tracking, RF tracking, ultrasound-based tracking or any combination thereof.
In accordance with an embodiment of the present invention, the 6DoF tracking unit (104) comprises an RF-Radar on chip in the case of RF tracking. Further, the RF-Radar on chip comprises a plurality of transceivers configured to send and receive a plurality of RF signals and accordingly enable the processing module to create a 4D image of the surroundings including the one or more devices (120) and the people within the surroundings, to detect the number of people in the surroundings, recognise the people without any visual inputs and monitor physiological and behavioural analytics of the people.
According to a second aspect of the present invention, there is provided a method for enabling collaborative environment between multiple devices using 6 Degree of Freedom (DoF) tracking and Internet of Things (IoT). The method comprises receiving 6DoF tracking data of one or more devices coupled with a respective 6DoF tracker at a centralized router; determining and tracking 6DoF pose of each of the one or more devices with respect to the centralized router and with respect to other one or more devices; and enabling the collaborative environment for the one or more devices to perform one or more operations individually and/or in combination, based on the 6DoF pose of each of the one or more devices. Further, the one or more devices are selected from a first set of devices and/or a second set of devices, and are coupled with respective one or more 6DoF trackers, each 6DoF tracker being tracked by a 6DoF tracking unit of the centralized router in real-time. Additionally, the second set of devices are connected with the centralized router via a wireless communication network.
In accordance with an embodiment of the present invention, the second set of devices are selected from one or more of electrical appliances, audio devices, display devices, Mixed Reality (MR) based Head Mounted Devices (HMDs) and mobile computing devices. Accordingly, the step of enabling the collaborative environment for the one or more devices to perform one or more operations comprises receiving one or more commands in the form of voice commands, gestures and/or touch inputs from the second set of devices and/or the audio unit of the centralized router; and performing one or more functions of the connected second set of devices based on the received one or more commands.
In accordance with an embodiment of the present invention, the one or more functions include switching on/off & changing settings of the second set of devices, playing/stopping music on audio devices & mobile computing devices, making & attending calls, messages on mobile computing devices, playing or changing video content on display devices and mobile computing devices, getting weather updates, getting news updates, setting alarms and controlling the electrical appliances.
In accordance with an embodiment of the present invention, the second set of devices connected with the centralized router are one or more HMDs. Accordingly, the step of enabling the collaborative environment for the one or more devices to perform one or more operations comprises a step of streaming mixed reality content in a common mixed reality space of the connected one or more HMDs, based on the 6DoF poses of each of the one or more devices, thereby enabling collaborative mixed reality sessions for users.
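As a hedged, non-limiting sketch of how such a common mixed reality space may be realised (the function name and the example poses are hypothetical, assumed only for illustration), a shared content anchor given in the centralized router's coordinate frame may be transformed into each HMD's local frame, so that every connected headset renders the same virtual object at one consistent physical location:

```python
import numpy as np

def to_hmd_frame(anchor_router, hmd_pos, hmd_rot):
    """Express a shared content anchor (given in the centralized router's
    frame) in one HMD's local frame: p_local = R_hmd^T (p_router - p_hmd)."""
    return hmd_rot.T @ (np.asarray(anchor_router, float) - np.asarray(hmd_pos, float))

# Hypothetical shared anchor at (2, 0, 1) in the router frame, and two HMDs
# whose 6DoF poses are tracked by the router. Each headset receives the same
# anchor but renders it at its own local coordinates.
anchor = [2.0, 0.0, 1.0]
hmds = {"HMD-A": ([0.0, 0.0, 0.0], np.eye(3)),
        "HMD-B": ([1.0, 0.0, 1.0], np.eye(3))}
for name, (pos, rot) in hmds.items():
    print(name, to_hmd_frame(anchor, pos, rot))
```

Because the 6DoF poses of all HMDs are tracked against the same router frame in real-time, this per-headset transform keeps the collaborative session spatially consistent as users move.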
In accordance with an embodiment of the present invention, the first set of devices are selected from mechanical tools, sports equipment, fitness equipment and household objects. Accordingly, the step of enabling the collaborative environment for the one or more devices to perform one or more operations comprises generating 3D video content related to the first set of devices, fused with the 6DoF poses of each of the one or more devices; and displaying the 3D video content in one or more of the connected display devices, the mobile computing devices and a mixed reality space of the connected one or more HMDs.
In accordance with an embodiment of the present invention, the method further comprises the steps of locking and unlocking the centralized router using biometrics.
In accordance with an embodiment of the present invention, the 6DoF tracking unit (104) implements EM tracking, optical/visual tracking, visual-inertial tracking, WiFi based tracking, RF tracking, ultrasound-based tracking or any combination thereof.
In accordance with an embodiment of the present invention, the 6DoF tracking unit (104) comprises an RF-Radar on chip in the case of RF tracking. Further, the RF-Radar on chip comprises a plurality of transceivers configured to send and receive a plurality of RF signals and accordingly enable the centralized router (102) to create a 4D image of the surroundings including the one or more devices (120) and the people within, to detect the number of people in the surroundings, recognise the people without any visual inputs and monitor physiological and behavioural analytics of the people.
BRIEF DESCRIPTION OF THE DRAWINGS
So that the manner in which the above recited features of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to embodiments, some of which are illustrated in the appended drawings. It is to be noted, however, that the appended drawings illustrate only typical embodiments of this invention and are therefore not to be considered limiting of its scope, for the invention may admit to other equally effective embodiments.
These and other features, benefits and advantages of the present invention will become apparent by reference to the following text and figures, with like reference numbers referring to like structures across the views, wherein:
Fig. 1 illustrates a system for enabling collaborative environment between multiple devices using 6 Degree of Freedom (DoF) tracking and Internet of things (IoT), in accordance with an embodiment of the present invention;
Fig. 2 illustrates a method for enabling collaborative environment between multiple devices using 6 Degree of Freedom (DoF) tracking and Internet of things (IoT), in accordance with an embodiment of the present invention;
Fig. 3A-3B illustrate information flow diagrams for enabling collaborative environment between multiple devices using the system of Fig. 1 and the method of Fig. 2, in accordance with an embodiment of the present invention;
Fig. 4A illustrates an exemplary implementation of the system of Fig. 1 for enabling a collaborative environment between a second set of devices and performing multiple operations, in accordance with another embodiment of the present invention;
Fig. 4B illustrates an exemplary implementation of the system of Fig. 1 for enabling a collaborative environment between multiple Head Mounted Devices, in accordance with another embodiment of the present invention; and
Fig. 4C illustrates an exemplary implementation of the system of Fig. 1 for enabling a collaborative environment between a first set of devices and the second set of devices, in accordance with another embodiment of the present invention.
DETAILED DESCRIPTION OF DRAWINGS
While the present invention is described herein by way of example using embodiments and illustrative drawings, those skilled in the art will recognize that the invention is not limited to the embodiments or drawings described, and that the drawings are not intended to represent the scale of the various components. Further, some components that may form a part of the invention may not be illustrated in certain figures, for ease of illustration, and such omissions do not limit the embodiments outlined in any way. It should be understood that the drawings and detailed description thereto are not intended to limit the invention to the particular form disclosed; on the contrary, the invention is to cover all modifications, equivalents, and alternatives falling within the scope of the present invention as defined by the appended claims. As used throughout this description, the word "may" is used in a permissive sense (i.e. meaning having the potential to) rather than the mandatory sense (i.e. meaning must). Further, the words "a" or "an" mean "at least one" and the word "plurality" means "one or more" unless otherwise mentioned. Furthermore, the terminology and phraseology used herein is solely used for descriptive purposes and should not be construed as limiting in scope. Language such as "including," "comprising," "having," "containing," or "involving," and variations thereof, is intended to be broad and to encompass the subject matter listed thereafter, equivalents, and additional subject matter not recited, and is not intended to exclude other additives, components, integers or steps. Likewise, the term "comprising" is considered synonymous with the terms "including" or "containing" for applicable legal purposes. Any discussion of documents, acts, materials, devices, articles and the like is included in the specification solely for the purpose of providing a context for the present invention.
It is not suggested or represented that any or all of these matters form part of the prior art base or were common general knowledge in the field relevant to the present invention.
In this disclosure, whenever a composition or an element or a group of elements is preceded with the transitional phrase "comprising", it is understood that we also contemplate the same composition, element or group of elements with the transitional phrases "consisting of", "consisting", "selected from the group consisting of", "including", or "is" preceding the recitation of the composition, element or group of elements, and vice versa.
The present invention is described hereinafter by various embodiments with reference to the accompanying drawings, wherein reference numerals used in the accompanying drawings correspond to the like elements throughout the description. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, the embodiments are provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art. In the following detailed description, numeric values and ranges are provided for various aspects of the implementations described. These values and ranges are to be treated as examples only and are not intended to limit the scope of the claims. In addition, a number of materials are identified as suitable for various facets of the implementations. These materials are to be treated as exemplary and are not intended to limit the scope of the invention.
Figure 1 illustrates a system (100) for enabling a collaborative environment between multiple devices using 6 Degree of Freedom (DoF) tracking and Internet of things (IoT), in accordance with an embodiment of the present invention. Herein the “collaborative environment” may be understood as an environment wherein multiple devices (such as internet enabled devices, non-computing devices, objects etc.) may interact with each other by utilising real-time 6DoF tracking of each device and a wireless network such as internet.
Referring to figure 1, the system (100) comprises a centralized router (102) placed at a predetermined location, which may be, but is not limited to, a room, floor, home, office, mall or any other space where the collaborative environment is desired to be generated. Further, the system (100) comprises one or more devices (120) that are either tracked by the centralized router (102), wirelessly connected with the centralized router (102), or both.
In accordance with an embodiment of the present invention, the centralized router (102) comprises a 6 Degree of Freedom (DoF) tracking unit (104), a streaming unit (112), an audio unit (108), a wireless network module (106), a processing module (110) and a power module (114). The 6 degrees of freedom include motion along the x axis, y axis and z axis as well as yaw, pitch and roll motion. The 6DoF tracking may be implemented using techniques such as, but not limited to, Electromagnetic (EM) tracking, optical/visual tracking, visual-inertial tracking, WiFi based tracking, RF tracking, ultrasound-based tracking or a combination thereof. In one embodiment, the 6DoF tracking unit (104) is a beyond line of sight communication-based tracking unit (104) using a combination of EM tracking as well as RF tracking. In that sense, the 6DoF tracking unit (104) may include, but is not limited to, an EM emitter with an in-built RF antenna that emits an EM field, and an amplifier. Additionally, the 6DoF tracking unit (104) comprises an RF-Radar on chip that houses a plurality of transceivers that send and receive a plurality of signals. Using the EM tracking and the RF tracking is advantageous as tracking can be done beyond the line of sight and irrespective of the lighting conditions.
Further, the streaming unit (112) is envisaged to be capable of streaming 3D video content fused with 6DoF data directly to various displays and Head Mounted Devices (HMDs) wirelessly. In addition, the audio unit (108) includes an array of microphones and one or more speakers (not shown). The one or more microphones are configured to capture binaural audio along the motion of a user, which may include voice commands for performing certain operations. The one or more speakers are envisaged to provide output audio from the centralized router (102) in response to the commands or for providing any other information. The audio unit (108) may implement various noise cancellation techniques to further enhance audio quality. In one embodiment, the audio unit (108) may also include a voice recognition module to allow only authorised users to operate the centralized router (102).
Further, the wireless network module (106) is configured to establish a wireless communication network (116) to enable wireless communication with the one or more devices (120). In that sense, the wireless network module (106) may include one or more of, but not limited to, a WiFi module and a GSM/GPRS module. Therefore, the wireless communication network (116) may be, but is not limited to, a wireless intranet network, WiFi internet or a GSM/GPRS based 2G, 3G, 4G, LTE or 5G communication network. Preferably, the wireless communication network (116) is the internet. In one embodiment, the centralized router (102) is itself connected with the internet.
In addition, the processing module (110) is connected with each of the 6DoF tracking unit (104), the wireless network module (106) and the streaming unit (112). Herein, the processing module (110) is envisaged to include computing capabilities such as a memory unit configured to store machine-readable instructions. The machine-readable instructions may be loaded into the memory unit from a non-transitory machine-readable medium, such as, but not limited to, CD-ROMs, DVD-ROMs and flash drives. Alternately, the machine-readable instructions may be loaded in the form of a computer software program into the memory unit. The memory unit in that manner may be selected from a group comprising EPROM, EEPROM and Flash memory. Further, the processing module (110) includes a processor operably connected with the memory unit. In various embodiments, the processor is one of, but not limited to, a general-purpose processor, an application specific integrated circuit and a field-programmable gate array. In one embodiment, the processing module (110) may be a part of a dedicated computing device or may be a microprocessor, that is, a multipurpose, clock-driven, register-based digital integrated circuit that accepts binary data as input, processes it according to instructions stored in its memory and provides results as output.
The processing module (110) may further implement artificial intelligence and machine learning based technologies for, but not limited to, data analysis, collating data and presentation of data in real-time. In accordance with an embodiment of the present invention, a data repository (not shown) may also be connected with the system (100). The data repository may be, but is not limited to, a local or a cloud-based storage. The data repository may store 3D video content and mixed reality content, which may be provided to the processing module (110) and the streaming unit (112) when queried using appropriate protocols. In accordance with another embodiment of the present invention, the centralized router (102) comprises one or more sensors (not shown) to enable the processing module (110) to lock and unlock the centralized router (102). In that sense, the one or more sensors may include biometric sensors for fingerprint scanning, facial recognition and iris/retina recognition for enhanced security.
Moreover, the power module (114) is configured to provide electrical power to the 6DoF tracking unit (104), the streaming unit (112), the audio unit (108), the wireless network module (106), the processing module (110) and the one or more sensors. In that sense, the power module (114) may be a rechargeable battery or a non-rechargeable battery. In another embodiment, the power module (114) is configured to be connected with an electrical socket to receive electrical power.
In accordance with an embodiment of the present invention, the centralized router (102) may be shaped as a hollow box-like enclosure housing all the above-mentioned components, made from a material such as, but not limited to, plastic, metal or a polymer. It may be provided with a mounting mechanism or stand for easy placement of the centralized router (102) at a central position at a particular location.
Further included in the system (100) are the one or more devices (120). The one or more devices (120) are selected from a first set of devices (1202) and/or a second set of devices (1204). The first set of devices (1202) may be devices that do not have an internet connection and are not wirelessly connected with the centralized router (102). In that sense, the first set of devices (1202) are selected from, but not limited to, mechanical tools, sports equipment, fitness equipment and household objects. As shown in figure 1, these may be a cricket/baseball bat, a racket, a hammer, household objects and kitchen tools like spoons, knives, bottles etc.
The second set of devices (1204), on the other hand, are smart devices that may be connected with the centralized router (102) via the wireless communication network (116) established by the centralized router (102). In that sense, the second set of devices (1204) are selected from one or more of, but not limited to, electrical appliances, audio devices, display devices, Mixed Reality (MR) based Head Mounted Devices (HMDs) and mobile computing devices. As shown in figure 1, the second set of devices (1204) may be a microwave, a smart TV, a smartphone, a smart Air Conditioner (AC), smart lighting, fans etc. All the second set of devices (1204) are connected with the centralized router (102) wirelessly and are capable of being operated remotely.
Both the first set of devices (1202) and the second set of devices (1204) are envisaged to be connected with respective 6DoF trackers (118). A 6DoF tracker (118) may simply be attached to each of the first set of devices (1202) and the second set of devices (1204). The 6DoF tracker (118) is adapted to be tracked by the 6DoF tracking unit (104) of the centralized router (102). The 6DoF trackers (118) may include one or more components such as, but not limited to, a magnetic coil, an ADC and an Inertial Measurement Unit (IMU) comprising an accelerometer, a gyroscope and a magnetometer. The magnetic field emitted by the EM emitter of the 6DoF tracking unit (104) of the centralized router (102) is sensed by the magnetic coil. A corresponding voltage may then be generated to calculate the 6DoF pose of the object to which the 6DoF tracker (118) is attached. The IMU may be used to compensate for jitter (if any) to correct the 6DoF pose. After that, each 6DoF tracker (118) sends its respective fused pose back to the 6DoF tracking unit (104) over an RF link. This process takes place in real-time.
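The EM/IMU fusion described above may take many forms; one minimal, non-limiting sketch (a simple complementary filter, assumed here only for illustration and not mandated by the invention) fuses a jitter-prone EM-derived orientation angle with the smooth but drift-prone angle integrated from the gyroscope rate:

```python
def complementary_filter(em_angle, gyro_rate, prev_angle, dt, alpha=0.98):
    """Fuse a jitter-prone EM-derived angle with gyroscope integration.
    alpha weights the smooth gyro path; (1 - alpha) lets the EM reading
    slowly correct the accumulated gyro drift."""
    gyro_angle = prev_angle + gyro_rate * dt  # integrate angular rate
    return alpha * gyro_angle + (1.0 - alpha) * em_angle

# Hypothetical stream of noisy EM yaw readings around a true value of
# 0.5 rad, with the tracker held still (gyro rate ~ 0): the fused
# estimate settles near 0.5 while smoothing the per-sample jitter.
fused = 0.0
for em_reading in [0.52, 0.48, 0.51, 0.49, 0.50] * 40:
    fused = complementary_filter(em_reading, 0.0, fused, dt=0.01)
print(round(fused, 2))
```

The same filter would be applied per axis; the corrected, fused pose is what each tracker (118) then reports back over the RF link.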
Apart from this, the RF-Radar on chip provided in the centralized router (102) is configured to send and receive a plurality of RF signals and accordingly enable the processing module (110) to create a high-resolution 4D image of the surroundings, including the one or more devices (120) and the people within, by analysing the received signals. Thus, all the devices and the people within the surroundings are also tracked using the RF-Radar on chip.
In one embodiment, if one of the second set of devices (1204), such as a smartphone, is not connected with the centralized router (102) via the wireless communication network (116) and is only being tracked by the attached 6DoF tracker (118), then the smartphone may be considered as one of the first set of devices (1202), i.e. like any other object.
Additionally, the Mixed Reality based HMDs referred to herein may be envisaged to include capabilities of generating an augmented reality environment, a mixed reality environment and a virtual reality environment that lets a user interact with digital content within the environment generated in the HMD. Although the specification mostly describes the HMD as a mixed reality-based HMD, it will be appreciated by a skilled addressee that any Augmented or Virtual reality-based HMD may be used without departing from the scope of the present invention. The HMD is envisaged to be worn by the user and therefore may be provided with, but not limited to, one or more bands, straps and locks for mounting on the head; or may even be provided as smart glasses to be worn just like spectacles. It will be understood by a person skilled in the art that the below-mentioned components of the HMD and their description should be considered as exemplary and not in a strict sense. The HMD may include components selected from, but not limited to, an optical unit having one or more lenses, one or more reflective mirrors & a display unit; a sensing unit having one or more sensors & an image acquisition device; an HMD audio unit comprising one or more speakers and one or more microphones; a user interface; a wireless communication module and a microprocessor.
In accordance with an embodiment of the present invention, the optical unit is envisaged to provide a high resolution and wider field of view. The display unit may comprise a Liquid Crystal on Silicon display and a visor. In accordance with an embodiment of the present invention, the one or more sensors may be selected from, but not limited to, an RGB sensor, a depth sensor, an eye tracking sensor, an EM sensor, an ambient light sensor, an accelerometer, a gyroscope and a magnetometer.
Furthermore, the image acquisition device is selected from one or more of, but not limited to, omnidirectional cameras, wide angle stereo vision cameras, RGB-D cameras, digital cameras, thermal cameras, infrared cameras and night vision cameras. In accordance with an embodiment of the present invention, the one or more microphones in the HMD audio unit are configured to capture binaural audio along the motion of the user and 3D stereo sound with acoustic source localization with the help of the IMU. The HMD audio unit may also implement background noise cancellation techniques to further enhance the experience. Furthermore, the one or more speakers may have an audio projection mechanism that projects sound directly to the concha of an ear of the user, reaching the ear canal after multiple reflections.
In accordance with an embodiment of the present invention, the HMD further comprises one or more ports configured to enable wired connection between one or more external sources and the HMD. The one or more ports may be, but not limited to, micro-USB ports, USB Type-C ports and HDMI ports. Further, the wireless communication module is configured to connect with the wireless communication network (116) to enable wireless communication between the centralized router (102) and the HMD. Additionally, it may also connect the HMD with other available wireless networks for sending and receiving information wirelessly.
Further, the HMD includes a microprocessor that may be a multipurpose, clock driven, register based, digital integrated circuit that accepts binary data as input, processes it according to instructions stored in its memory and provides results as output. The microprocessor may contain both combinational logic and sequential digital logic. The microprocessor is the brain of the HMD and is configured to facilitate operation of each of the components of the HMD. The HMD may also implement artificial intelligence and machine learning based technologies for, but not limited to, data analysis, collating data and presentation of data in real-time.
Figure 2 illustrates a method (200) for enabling collaborative environment between multiple devices using 6 Degree of Freedom (DoF) tracking and Internet of Things (IoT), in accordance with an embodiment of the present invention. As shown in figure 2, the method (200) starts at step 210 by receiving 6DoF tracking data of the one or more devices (120) coupled with the respective 6DoF trackers (118) at the centralized router (102). The same has been illustrated in figure 3A. As previously mentioned, the one or more devices (120) are selected from the first set of devices (1202) and/or the second set of devices (1204). Each device is tracked by the 6DoF tracking unit (104) of the centralized router (102). In one embodiment, each 6DoF tracker (118) has a unique hardware ID for easy identification by the processing module (110). The first set of devices (1202) are not connected with the internet and may be selected from, but not limited to, mechanical tools, sports equipment, fitness equipment and household objects. Additionally, the second set of devices (1204) are selected from one or more of, but not limited to, electrical appliances, audio devices, display devices, HMDs and mobile computing devices. So, the centralized router (102) receives 6DoF tracking data of each of the first set of devices (1202) and the second set of devices (1204) from the respective 6DoF trackers (118). Apart from that, the second set of devices (1204) are also connected with the centralized router (102) via the wireless communication network (116).
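As a sketch of step 210, the router can be modelled as a registry keyed by each tracker's unique hardware ID; the class names, IDs and data layout below are illustrative assumptions only:

```python
from dataclasses import dataclass, field

@dataclass
class Pose6DoF:
    """Position (x, y, z) in metres and orientation (roll, pitch, yaw) in radians."""
    position: tuple
    orientation: tuple

@dataclass
class TrackerRegistry:
    """Incoming tracking data keyed by the tracker's unique hardware ID;
    only the latest pose of the real-time stream is retained."""
    poses: dict = field(default_factory=dict)

    def receive(self, hardware_id: str, pose: Pose6DoF) -> None:
        self.poses[hardware_id] = pose

registry = TrackerRegistry()
registry.receive("TRK-001", Pose6DoF((0.0, 1.2, 0.4), (0.0, 0.0, 0.0)))  # e.g. a tool
registry.receive("TRK-002", Pose6DoF((2.5, 0.9, 1.1), (0.0, 0.1, 0.0)))  # e.g. an HMD
registry.receive("TRK-001", Pose6DoF((0.1, 1.2, 0.4), (0.0, 0.0, 0.0)))  # updated pose
print(len(registry.poses))  # 2
```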
Further, at step 220, the processing module (110) is configured to determine and track the 6DoF pose of each of the one or more devices (120), with respect to the centralized router (102) and with respect to the other one or more devices (120). Herein, the magnetic field emitted by the EM emitter of the 6DoF tracking unit (104) of the centralized router (102) is sensed by the magnetic coil of the 6DoF tracker (118). A corresponding voltage may then be generated to calculate the 6DoF pose of the object to which the 6DoF tracker (118) is attached. Herein, the 6DoF pose is envisaged to include position, orientation etc. The IMU may be used to compensate for jitter (if any) to correct the 6DoF pose. After that, all the 6DoF trackers (118) send the respective fused poses back to the 6DoF tracking unit (104) over an RF link. This process takes place in real-time. This way, both the first set of devices (1202) and the second set of devices (1204) are tracked in real-time.
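One conventional way to blend a jitter-prone but drift-free EM measurement with smooth IMU increments is a complementary filter. The specification does not fix a fusion algorithm, so the single-axis sketch below, with an assumed weighting `alpha`, is only illustrative:

```python
def fuse_axis(prev_fused: float, imu_delta: float, em_measurement: float,
              alpha: float = 0.9) -> float:
    """Propagate the last fused estimate with the IMU increment (smooth, but
    drifts over time), then correct toward the EM measurement (jittery, but
    drift-free). Applied per axis of the 6DoF pose, in real-time."""
    predicted = prev_fused + imu_delta
    return alpha * predicted + (1 - alpha) * em_measurement

# One update step along a single position axis (values in metres).
print(round(fuse_axis(1.0, 0.1, 1.05), 3))  # 1.095
```

The higher `alpha` is, the more the short-term motion comes from the IMU, with the EM reading slowly pulling the estimate back to an absolute reference.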
Apart from this, the RF-Radar on chip provided in the centralized router (102) is configured to send and receive a plurality of RF signals and accordingly enable the processing module (110) to create a high resolution 4D image of the surroundings, including the one or more devices (120) and the people within, by analysing the received signals. So, all the one or more devices (120) and the people within the surroundings are also tracked using the RF-Radar on chip.
After that, at step 230, the processing module (110) enables a collaborative environment (302) for the one or more devices (120) to perform one or more operations individually and/or in combination, based on the 6DoF pose of each of the one or more devices (120). The same has been illustrated in figure 3B. The collaborative environment (302) can be understood as a collaboration of each of the first set of devices (1202) and/or the second set of devices (1204), irrespective of whether they have a network connection or not. All the devices may be incorporated within the collaborative environment (302) if they have the respective 6DoF tracker (118) attached. Herein, one of the second set of devices (1204) may perform & control one or more operations of the other second set of devices (1204) and the first set of devices (1202). For example, the processing module (110) may receive one or more commands in a form of voice commands, gestures and/or touch inputs from the second set of devices (1204) (say, an HMD or a smart phone) and/or the audio unit (108) of the centralized router (102). Accordingly, one or more functions of the connected second set of devices (1204) are performed based on the received one or more commands. The one or more functions may include, but are not limited to, switching on/off & changing settings of the second set of devices (1204), playing/stopping music on audio devices & mobile computing devices, making & attending calls and messages on mobile computing devices, playing or changing video content on display devices and mobile computing devices, getting weather updates, getting news updates, setting alarms and controlling the electrical appliances.
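The command flow of step 230 can be sketched as a dispatch table that routes a parsed command (from voice, gesture or touch input) to a function of a connected second-set device; the device names and actions below are hypothetical examples:

```python
def make_dispatcher(devices: dict):
    """Build a dispatcher over the currently connected second-set devices.
    Each device is represented by a callable that performs a named action."""
    def dispatch(target: str, action: str) -> str:
        handler = devices.get(target)
        if handler is None:
            return f"'{target}' is not connected to the centralized router"
        return handler(action)
    return dispatch

# Stubbed control functions for two connected second-set devices.
connected = {
    "tv": lambda action: f"TV: {action}",
    "ac": lambda action: f"AC: {action}",
}
dispatch = make_dispatcher(connected)
print(dispatch("tv", "switch on"))   # TV: switch on
print(dispatch("oven", "preheat"))   # 'oven' is not connected to the centralized router
```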
Additionally, the signals sent and received by the RF-Radar on chip and analysed by the processing module (110) are not just used for 4D image generation or tracking, but also to detect the number of people in the surroundings, recognise the people without any visual inputs and monitor physiological and behavioural analytics of the people. The physiological analytics include, but are not limited to, gait analysis, breathing monitoring, posture detection and sleep analysis. The behavioural analytics include, but are not limited to, activity levels, wake and sleep times, nocturnal roaming and bathroom usage. The RF-Radar on chip also enables the processing module to perform mood monitoring and health analytics by sensing temperature profiles, breathing and heart rate of people (without any trackers on the people). Health monitoring is done by analysing vital signs without visual inspection, which means no privacy intrusion. Remote family members can be notified using these analytics if the health of any family member in the house deteriorates; for this purpose, contact details of the family members are pre-stored in the system. Furthermore, this also enables tracking of hand gestures and body gestures of people without any visual inputs or wearable trackers, for MR experiences, pose estimation, games etc.
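Vital-sign radars commonly recover breathing rate as the dominant low-frequency component of the chest-displacement (phase) signal. The specification does not name an algorithm, so the sketch below, using a plain discrete Fourier transform over synthetic data, is one possible assumption:

```python
import math

def breathing_rate_bpm(phase_samples, sample_rate_hz):
    """Estimate breathing rate by locating the dominant periodic component
    of a radar phase (chest displacement) signal with a brute-force DFT."""
    n = len(phase_samples)
    mean = sum(phase_samples) / n
    centred = [s - mean for s in phase_samples]
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        re = sum(c * math.cos(2 * math.pi * k * i / n) for i, c in enumerate(centred))
        im = sum(c * math.sin(2 * math.pi * k * i / n) for i, c in enumerate(centred))
        mag = re * re + im * im
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * sample_rate_hz / n * 60.0  # cycles/s -> breaths/min

# Synthetic 0.25 Hz (15 breaths/min) chest motion, sampled at 10 Hz for 40 s.
samples = [math.sin(2 * math.pi * 0.25 * t / 10.0) for t in range(400)]
print(round(breathing_rate_bpm(samples, 10.0)))  # 15
```

A production system would use an FFT and windowing rather than this O(n²) loop; the point here is only the spectral-peak idea.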
The above method (200) would be understood more clearly with the help of examples and real-life implementations. Accordingly, figures 4A-4C illustrate multiple exemplary embodiments of the present system (100) and method (200) covering various aspects, variations and applications.
Figure 4A illustrates an exemplary implementation (410) of the system (100) of Fig. 1 for enabling the collaborative environment (302) between the second set of devices (1204) and performing multiple operations, in accordance with another embodiment of the present invention. In this example, it is assumed that only the second set of devices (1204) are connected with the centralized router (102), and the centralized router (102) has generated a collaborative environment (302) for the second set of devices (1204) to operate. As shown in figure 4A, the second set of devices (1204) may be the smart phone, the HMD, the air conditioner, the TV and the microwave. Herein, the smartphone and the HMD, individually or in combination, are being used to operate the air conditioner, the TV and the microwave. Each of the second set of devices (1204) shown has a 6DoF tracker (118) attached to it and is tracked by the centralized router (102). Alternately or in combination, the RF-Radar on chip may be used for tracking the second set of devices (1204) herein.
The smartphone and/or HMD may give one or more commands in a form of voice commands, gestures and/or touch inputs to perform and control one or more functions of the air conditioner, the TV and the microwave. Additionally, the user may simply give one or more commands as voice input to the centralized router (102), such as "Switch on the TV and turn off the AC", and consequently both operations are carried out on the TV and the AC respectively. Besides, the user may simply ask the centralized router (102) "How's the weather today?" and the centralized router (102) may fetch the required information regarding temperature, humidity and other forecasts from the internet. The same information may be delivered via the one or more speakers of the audio unit (108) or as a notification on the connected smartphone/HMD. The wireless network module (106) may connect the processing module (110), and thereby the centralized router (102), to a cloud server via the wireless communication network (116), to process the voice commands and perform the action accordingly.
Figure 4B illustrates an exemplary implementation (420) of the system (100) of Fig. 1 for enabling the collaborative environment between multiple Head Mounted Devices, in accordance with an embodiment of the present invention. In this example, it is assumed that there is a first user (406) and a second user (408) wearing respective HMDs (i.e. second set of devices (1204)) and holding respective tools (i.e. first set of devices (1202)) in their hands. The HMDs are connected with the centralized router (102) via the wireless communication network (116). Each of the HMDs and the tools also has its respective 6DoF tracker (118). Alternately or in combination, the RF-Radar on chip may be used for tracking the users, the HMDs and the tools. So, the centralized router (102) receives 6DoF tracking data of each of the HMDs and the tools, and the processing module (110) determines the respective 6DoF poses. Accordingly, the streaming unit (112) is configured to provide mixed reality content (404) fused with the respective 6DoF poses in a common mixed reality space (402) for each of the HMDs.
The 6DoF poses of each distinctive 6DoF tracker mounted on the HMD are automatically synced with the centralized router's (102) 6DoF tracking unit (104) within real time scenarios. As shown in figure 4B, the first user (406) and the second user (408) are working together on an architectural structure, wherein the mixed reality content (404) is provided to each of the HMDs by the Streaming unit (112) based on the respective 6DoF poses. Also, the first user (406) and the second user (408) are able to use the tools for construction work in the common mixed reality space (402) shared by both the users. In one embodiment, the HMD (along with its sensors and cameras) may also assist in tracking the tool in the common mixed reality space (402) and provide spatial sound effects. Such features enable the present invention to be used in the fields of education, teaching, medical studies, surgery, architecture, civil engineering, mechanical engineering, office meetings, making strategies, project planning etc.
Figure 4C illustrates an exemplary implementation (430) of the system (100) of Fig. 1 for enabling a collaborative environment between the first set of devices (1202) and the second set of devices (1204), in accordance with an embodiment of the present invention. As shown in figure 4C, it is assumed that a user (410) wearing the HMD (i.e. the second set of devices (1204)) is holding a baseball bat (i.e. the first set of devices (1202)) in his hands. Each of the HMD and the baseball bat has a respective 6DoF tracker (118). The HMD is connected with the centralized router (102) via the wireless communication network (116). So, in this scenario, the centralized router (102) receives the 6DoF tracking data of the HMD and the baseball bat and determines the respective 6DoF poses. Alternately or in combination, the RF-Radar on chip may be used for tracking both the baseball bat and the user (410). Additionally, the centralized router (102) may determine, using the HMD, that the first set of devices (1202) is a baseball bat and accordingly provide 3D video content/mixed reality content related to baseball in the mixed reality space (402). The 3D video content/mixed reality content may be a virtual game created in the mixed reality space (402) wherein a ball is being thrown at the user (410) and he has to hit the ball with the baseball bat in his hands. The same has been illustrated in figure 4C. This may actually be a simulation of a real baseball game. Additionally, the user (410) may choose to play the game in virtual reality with a stadium-like virtual environment, or in mixed reality with content being overlaid on the visible surroundings, like playing in an empty space in the house.
Additionally, in another example (not shown), instead of a baseball bat, the user may hold a cricket bat, badminton racket, table-tennis racket etc., and accordingly the 3D video content/mixed reality content fused with the respective 6DoF poses comprising a real game simulation may be generated in the mixed reality space. It may also include multiple modes, such as a practice mode and a game mode. Furthermore, not just the sports equipment but also household objects such as a fork, knife, bottle etc. may be used in certain game simulations as sports equipment. For example, a table spoon may be used as a table-tennis racket in the table-tennis game simulation in the mixed reality space of the HMD, wherein the user is holding the table spoon but seeing the racket in his hand in the mixed reality space. All such simulations may be pre-stored in the data repository and may be updated from time to time. Further, these simulations can also be played with multiple users wearing respective HMDs in a common mixed reality space and/or in separate mixed reality spaces at the same time. For example, two users may be playing table tennis on their own tables (with 6DoF trackers (118) attached). Also, multiple HMDs connected with the same centralized router (102) may receive different 3D video content comprising different simulations at the same time. These features enable the present invention to be used for AR/VR/MR gaming and offer unlimited opportunities for game developers to take advantage of.
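The object-to-equipment substitution described above can be sketched as a simple lookup from a tracked physical object to the virtual item rendered in the HMD; the mapping entries below are illustrative examples only:

```python
# Hypothetical pre-stored proxy table: a tracked household object may stand
# in for sports equipment inside a game simulation.
EQUIPMENT_PROXIES = {
    "table_spoon": "table_tennis_racket",
    "baseball_bat": "baseball_bat",
    "bottle": "badminton_racket",
}

def virtual_equipment(tracked_object: str) -> str:
    """Return the virtual item rendered in the mixed reality space for a
    given tracked physical object; unmapped objects appear as themselves."""
    return EQUIPMENT_PROXIES.get(tracked_object, tracked_object)

print(virtual_equipment("table_spoon"))  # table_tennis_racket
```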
In another embodiment (not shown), when the second set of devices (1204) connected to centralized router (102) include HMD as well as smartphones, laptops or PCs, then videos from each of the above-mentioned devices may be rendered on the other connected devices. For example: a video from a smart phone may be played in the mixed reality space of the connected HMD or vice versa.
The present invention offers a number of advantages:
1. Low processing requirements and inexpensive.
2. Easy control of smart appliances using smartphone, HMD, voice commands etc.
3. Multi-User MR/AR/VR Experience.
4. Works well in both Indoor-Outdoor Scenarios.
5. Collaborative MR/XR Experience.
6. Centralized computing and rendering of high-quality mixed reality content.
7. Luminosity insensitive.
8. Persistent tracking within range.
9. AI Enabled Sensor fusion.
10. Can perform physiological, behavioural and health analytics of the people detected within the surroundings.
11. Monitors real-time activity, with a high degree of privacy.
12. Works in all lighting and weather conditions.
13. Can be installed in a concealed location.
14. Detects objects accurately and within a wide range.
15. Customizable systems to accommodate different needs and requirements.
Furthermore, with vision-based 6DoF pose tracking comes a privacy issue, so the present invention uses non-visual methods for 6DoF pose tracking, which helps the system treat the user's privacy as the most important concern. Additionally, any low-cost HMD or any other smart or ordinary device/object can be tracked for 6DoF pose estimation using just a 6DoF tracker (118) and turned into a controller. Alternately or in combination, the RF-Radar on chip may be used for tracking the one or more devices (120). Since the present invention is not dependent on light and occlusion, hassle-free pose tracking is achieved.
The present invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments explained herein above. Rather, the embodiment is provided so that this disclosure will be thorough and complete and will fully convey the scope of the invention to those skilled in the art.
Further, one would appreciate that the wireless communication network used in the system can be a short-range communication network and/or a long-range communication network, wired or wireless. The communication interface includes, but is not limited to, a serial communication interface, a parallel communication interface or a combination thereof.
The Head Mounted Devices (HMDs) referred herein may also include more components such as, but not limited to, a Graphics Processing Unit (GPU) or any other graphics generation and processing module. The GPU may be a single-chip processor primarily used to manage and boost the performance of video and graphics such as 2-D or 3-D graphics, texture mapping, hardware overlays etc. The GPU may be selected from, but not limited to, NVIDIA, AMD, Intel and ARM for real time 3D imaging.
In general, the word "module," as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, written in a programming language, such as, for example, Java, C, Python or assembly. One or more software instructions in the modules may be embedded in firmware, such as an EPROM. It will be appreciated that modules may comprise connected logic units, such as gates and flip-flops, and may comprise programmable units, such as programmable gate arrays or processors. The modules described herein may be implemented as either software and/or hardware modules and may be stored in any type of computer-readable medium or other computer storage device.
Further, while one or more operations have been described as being performed by or otherwise related to certain modules, devices or entities, the operations may be performed by or otherwise related to any module, device or entity. As such, any function or operation that has been described as being performed by a module could alternatively be performed by a different server, by the cloud computing platform, or a combination thereof. It should be understood that the techniques of the present disclosure might be implemented using a variety of technologies. For example, the methods described herein may be implemented by a series of computer executable instructions residing on a suitable computer readable medium. Suitable computer readable media may include volatile (e.g. RAM) and/or non-volatile (e.g. ROM, disk) memory, carrier waves and transmission media. Exemplary carrier waves may take the form of electrical, electromagnetic or optical signals conveying digital data streams along a local network or a publicly accessible network such as the Internet.
It should also be understood that, unless specifically stated otherwise as apparent from the following discussion, it is appreciated that throughout the description, discussions utilizing terms such as "controlling" or "obtaining" or "computing" or "storing" or "receiving" or "determining" or the like, refer to the action and processes of a computer system, or similar electronic computing device, that processes and transforms data represented as physical (electronic) quantities within the computer system's registers and memories into other data similarly represented as physical quantities within the computer system memories or registers or other such information storage, transmission or display devices.
Various modifications to these embodiments are apparent to those skilled in the art from the description and the accompanying drawings. The principles associated with the various embodiments described herein may be applied to other embodiments. Therefore, the description is not intended to be limited to the embodiments shown along with the accompanying drawings, but is to be accorded the broadest scope consistent with the principles and the novel and inventive features disclosed or suggested herein. Accordingly, the invention is intended to embrace all such alternatives, modifications, and variations that fall within the scope of the present invention and the appended claims.
CLAIMS:
1. A system (100) for enabling collaborative environment between multiple devices using 6 Degree of Freedom (DOF) tracking and Internet of Things (IoT), the system (100) comprising:
a centralized router (102) comprising:
a 6DoF tracking unit (104);
a streaming unit (112);
an audio unit (108);
a wireless network module (106) to establish a wireless communication network (116);
a processing module (110) connected with the 6DoF tracking unit (104), the wireless network module (106) and the Streaming unit (112);
a power module (114) to provide electrical power to the centralized router (102); and
one or more devices (120) selected from a first set of devices (1202) and/or a second set of devices (1204), coupled with respective one or more 6DoF trackers (118), each 6DoF tracker (118) being tracked by the 6DoF tracking unit (104) of the centralized router (102) in real-time;
wherein the second set of devices (1204) are connected with the centralized router (102) via the wireless communication network (116);
wherein the processing module (110) is configured to:
receive 6DoF tracking data of each of the one or more devices (120) coupled with the respective 6DoF tracker (118), from the 6DoF tracking unit (104);
determine and track 6DoF pose of each of the one or more devices (120) with respect to the centralized router (102) and with respect to other one or more devices (120);
enable a collaborative environment for the one or more devices (120) to perform one or more operations individually and/or in combination, based on the 6DoF pose of each of the one or more devices (120).
2. The system (100) as claimed in claim 1, wherein the second set of devices (1204) are selected from one or more of electrical appliances, audio devices, display devices, Mixed Reality (MR) based Head Mounted devices (HMDs) and mobile computing devices; and
wherein the processing module (110) is configured to enable the collaborative environment for the one or more devices (120) to perform one or more operations by:
receiving one or more commands in a form of voice commands, gestures and/or touch inputs from the second set of devices (1204) and/or the audio unit (108) of the centralized router (102);
performing one or more functions of the connected second set of devices (1204) based on the received one or more commands.
3. The system (100) as claimed in claim 2, wherein the one or more functions include switching on/off & changing settings of second set of devices (1204), playing/stopping music on audio devices & mobile computing devices, making & attending calls, messages on mobile computing devices, playing or changing video content on display devices and mobile computing devices, getting weather updates, getting news updates, setting alarms and controlling the electrical appliances.
4. The system (100) as claimed in claim 1, wherein the second set of devices (1204) connected with the centralized router (102) are one or more HMDs;
wherein the processing module (110) in combination with the Streaming unit (112) is configured to enable the one or more devices (120) to perform the one or more operations by streaming mixed reality content in a common mixed reality space of the connected one or more HMDs, based on the 6DoF poses of each of the one or more devices (120), thereby enabling collaborative mixed reality sessions for users.
5. The system (100) as claimed in claim 2, wherein the first set of devices (1202) are selected from mechanical tools, sports equipment, fitness equipment and household objects;
wherein the processing module (110) in combination with the Streaming unit (112) is configured to perform the one or more operations by:
generating 3D video content related to the first set of devices (1202), fused with the 6DoF poses of each of the one or more devices (120); and
displaying the 3D video content in one or more of the connected display devices, the mobile computing devices and in a mixed reality space of the connected one or more HMDs.
6. The system (100) as claimed in claim 1, wherein the centralized router (102) further comprises one or more sensors to enable the processing module (110) to lock and unlock the centralized router (102) using biometrics.
7. The system (100) as claimed in claim 1, wherein the 6DoF tracking unit (104) implements EM tracking, optical/visual tracking, visual-inertial tracking, WiFi based tracking, RF tracking, ultrasound-based tracking or any combination thereof.
8. The system (100) as claimed in claim 7, wherein the 6DoF tracking unit (104) comprises an RF-Radar on chip, in case of RF tracking;
wherein the RF-Radar on chip further comprises a plurality of transceivers configured to send and receive a plurality of RF signals and accordingly enable the processing module to create a 4D image of the surroundings including the one or more devices (120) and the people within the surrounding, to detect a number of people in the surroundings, recognise the people without any visual inputs and monitor physiological and behavioural analytics of the people.
9. A method (200) for enabling collaborative environment between multiple devices using 6 Degree of Freedom (DOF) tracking and Internet of Things (IoT), the method (200) comprising:
receiving 6DoF tracking data of one or more devices (120) coupled with a respective 6DoF tracker (118) at a centralized router (102);
determining and tracking 6DoF pose of each of the one or more devices (120) with respect to the centralized router (102) and with respect to other one or more devices (120);
enabling the collaborative environment for the one or more devices (120) to perform one or more operations individually and/or in combination, based on the 6DoF pose of each of the one or more devices (120);
wherein one or more devices (120) selected from first set of devices (1202) and/or second set of devices (1204), coupled with respective one or more 6DoF trackers (118), each 6DoF tracker (118) being tracked by a 6DoF tracking unit (104) of the centralized router (102) in real-time;
wherein the second set of devices (1204) are connected with the centralized router (102) via a wireless communication network (116).
10. The method (200) as claimed in claim 9, wherein the second set of devices (1204) are selected from one or more of electrical appliances, audio devices, display devices, Mixed Reality (MR) based Head Mounted devices (HMDs) and mobile computing devices; and
wherein the step of enabling the collaborative environment for the one or more devices (120) to perform one or more operations comprises:
receiving one or more commands in a form of voice commands, gestures and/or touch inputs from the second set of devices (1204) and/or the audio unit (108) of the centralized router (102); and
performing one or more functions of the connected second set of devices (1204) based on the received one or more commands.
11. The method (200) as claimed in claim 10, wherein the one or more functions include switching on/off & changing settings of the second set of devices (1204), playing/stopping music on audio devices & mobile computing devices, making & attending calls, messages on mobile computing devices, playing or changing video content on display devices and mobile computing devices, getting weather updates, getting news updates, setting alarms and controlling the electrical appliances.
12. The method (200) as claimed in claim 9, wherein the second set of devices (1204) connected with the centralized router (102) are one or more HMDs;
wherein the step of enabling the collaborative environment for the one or more devices (120) to perform one or more operations comprises a step of streaming mixed reality content in a common mixed reality space of the connected one or more HMDs, based on the 6DoF poses of each of the one or more devices (120), thereby enabling collaborative mixed reality sessions for users.
13. The method (200) as claimed in claim 10, wherein the first set of devices (1202) are selected from mechanical tools, sports equipment, fitness equipment and household objects;
wherein the step of enabling the collaborative environment for the one or more devices (120) to perform one or more operations comprises:
generating 3D video content related to the first set of devices (1202), fused with the 6DoF poses of each of the one or more devices (120); and
displaying the 3D video content in one or more of the connected display devices, the mobile computing devices and in a mixed reality space of the connected one or more HMDs.
14. The method (200) as claimed in claim 9, further comprising steps of locking and unlocking the centralized router (102) using biometrics.
15. The method (200) as claimed in claim 9, wherein the 6DoF tracking unit (104) implements EM tracking, optical/visual tracking, visual-inertial tracking, WiFi based tracking, RF tracking, ultrasound-based tracking or any combination thereof.
16. The method (200) as claimed in claim 15, wherein the 6DoF tracking unit (104) comprises an RF-Radar on chip, in case of RF tracking;
wherein the RF-Radar on chip further comprises a plurality of transceivers configured to send and receive a plurality of RF signals and accordingly enable the centralized router (102) to create a 4D image of the surroundings including the one or more devices (120) and the people within, to detect a number of people in the surroundings, recognise the people without any visual inputs and monitor physiological and behavioural analytics of the people.