
Device And Method For Dynamic Ray Casting

Abstract: The present disclosure provides an electronic device (102) and a method (400) for selection extending/casting a ray (208) to determine a hit point (210). The method (400) includes generating (402) a three-dimensional (3D) reconstruction of an environment using sensors of the electronic device (102), determining (404) an angle of the electronic device (102) with respect to a reference surface, and determining (406) an interpolation value based on the angle. Further, method (400) includes determining (408) at least one of an origin point (206) or a deviation value based on the interpolation value, extending (410) the ray (208) to intersect with an opposing surface on the 3D reconstruction based on at least one of the origin point (206) or the deviation value, and determining (412) an intersection of the ray (208) with the opposing surface as the hit point (210).


Patent Information

Filing Date: 08 March 2024
Publication Number: 37/2025
Publication Type: INA
Invention Field: COMPUTER SCIENCE

Applicants

Flipkart Internet Private Limited
Building Alyssa Begonia & Clover, Embassy Tech Village, Outer Ring Road, Devarabeesanahalli Village, Bengaluru - 560103, Karnataka, India.

Inventors

1. SINGLA, Nischay
218, Sector 19-1, Huda, Kaithal, Haryana - 136027, India.

Specification

Description:

TECHNICAL FIELD
[0001] The present disclosure relates to the field of immersive technology. In particular, the present disclosure relates to a device and a method for dynamic ray casting to enable real-time adaptation in a virtual environment.

BACKGROUND
[0002] Ray casting is a virtual method used to position an object in an augmented reality (AR) environment. The method returns intersection points of a ray, in its direction of propagation, with planes and surfaces; the planes, surfaces, and points associated with the intersection points serve as virtual counterparts to real floors and walls. The planes and surfaces are determined using various sensors, such as depth sensors and cameras, including techniques such as plane fitting. Currently, virtual object placement often faces limitations in adapting to changing environments and user interactions. One significant drawback is the static nature of ray casting, where rays are cast without considering the dynamic aspects of the angle or tilt of a user device. In typical implementations, the ray is cast from the centre of the screen.
[0003] In static ray casting, a system struggles to accommodate real-time changes in the device angle, which leads to inaccuracies in realistic placement; for example, a three-dimensional (3D) object representing a refrigerator may be placed on a wall rather than the floor when the device is perpendicular to the floor. If a user tilts the user device, the static ray casting does not adequately adjust, which causes misalignment between virtual and real-world elements. This lack of adaptability may diminish the overall quality of augmented reality (AR)/mixed reality (MR) experiences.
[0004] Therefore, there is a need to address the drawbacks mentioned above and any other shortcomings, or at the very least, provide a valuable alternative to the existing methods and systems.

OBJECTS OF THE PRESENT DISCLOSURE
[0005] A general object of the present disclosure relates to an efficient and reliable system and method that obviates the above-mentioned limitations of existing systems and methods.
[0006] An object of the present disclosure relates to a device and a method for selection of an origin point for extending a ray therefrom.
[0007] Another object of the present disclosure relates to a device and a method for selection of an angle at which the ray is to be extended from an electronic device.
[0008] Another object of the present disclosure relates to a device and a method for determining a hit point to place a three-dimensional (3D) object based on an angle of the electronic device, thereby enhancing accuracy and reliability of the 3D reconstruction process.

SUMMARY
[0009] Aspects of the present disclosure relate to the field of immersive technology. In particular, the present disclosure relates to a device and a method for dynamic ray casting to enable real-time adaptation in a virtual environment.
[0010] In an aspect, a method for extending a ray to determine a hit point is provided. The method, which uses one or more processors, includes generating a three-dimensional (3D) reconstruction of an environment using one or more sensors associated with an electronic device, determining an angle of the electronic device with respect to a reference surface on the 3D reconstruction using the one or more sensors, and determining an interpolation value based on the angle. Further, the method includes determining at least one of an origin point or a deviation value based on the interpolation value, extending the ray to intersect with an opposing surface on the 3D reconstruction based on at least one of the origin point or the deviation value, and determining an intersection of the ray with the opposing surface as the hit point.
[0011] In some embodiments, for extending the ray, the method may include at least one of: extending, by the one or more processors, the ray from the origin point, or extending, by the one or more processors, the ray at a deviation angle equal to the deviation value with respect to the electronic device.
[0012] In some embodiments, for determining the origin point, the method may include determining an upper bound value and a lower bound value based on a geometry of the electronic device. The method may further include determining the interpolation value as a linear interpolation of the upper bound value, the lower bound value, and the angle, and determining, by the one or more processors, the origin point for the ray based on the interpolation value.
[0013] In some embodiments, for determining the upper bound value and the lower bound value, the method may include determining the upper bound value and the lower bound value as a proportion of a height dimension associated with the electronic device.
[0014] In some embodiments, for determining the deviation angle, the method may include determining the interpolation value as a linear interpolation of a predetermined ceiling value, a predetermined floor value, and the angle, and determining the deviation value for the ray based on the interpolation value.
[0015] In another aspect, an electronic device for extending a ray to determine a hit point includes one or more processors, and a memory coupled to the one or more processors, where the memory may include processor-executable instructions. Execution of the processor-executable instructions causes the one or more processors to generate a three-dimensional (3D) reconstruction of an environment using one or more sensors associated with the electronic device, determine an angle of the electronic device with respect to a reference surface on the 3D reconstruction using the one or more sensors, and determine an interpolation value based on the angle. The processors are further configured to determine at least one of an origin point or a deviation value based on the interpolation value, extend the ray to intersect with an opposing surface on the 3D reconstruction based on at least one of the origin point or the deviation value, and determine an intersection of the ray with the opposing surface as the hit point.
[0016] Various objects, features, aspects, and advantages of the inventive subject matter will become more apparent from the following detailed description of preferred embodiments, along with the accompanying drawing figures in which like numerals represent components.

BRIEF DESCRIPTION OF THE DRAWINGS
[0017] The accompanying drawings are included to provide a further understanding of the present disclosure and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the present disclosure and, together with the description, serve to explain the principles of the present disclosure.
[0018] FIG. 1 illustrates an example block diagram of an electronic device for selection of an origin point for a ray extended to determine a hit-point, in accordance with embodiments of the present disclosure.
[0019] FIGs. 2A-2H illustrate example representations for selection of the origin point for the ray extended to determine the hit-point, in accordance with embodiments of the present disclosure.
[0020] FIG. 2I illustrates an example representation for determining an interpolation value, in accordance with embodiments of the present disclosure.
[0021] FIGs. 2J-2K illustrate example representations of rays extended from the geometry at different deviation angles, in accordance with embodiments of the present disclosure.
[0022] FIGs. 3A-3B illustrate example representations of dynamic ray casting, in accordance with embodiments of the present disclosure.
[0023] FIG. 4 illustrates a flow chart of an example method for selection of the origin point for the ray extended to determine the hit-point, in accordance with embodiments of the present disclosure.
[0024] FIG. 5 illustrates an exemplary computer system in which or with which embodiments of the present disclosure may be implemented.

DETAILED DESCRIPTION
[0025] The following is a detailed description of embodiments of the disclosure depicted in the accompanying drawings. The embodiments are in such detail as to clearly communicate the disclosure. However, the amount of detail offered is not intended to limit the anticipated variations of embodiments; on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure as defined by the appended claims.
[0026] Embodiments explained herein relate to immersive technology. In particular, the present disclosure relates to a device and a method for dynamic ray casting to enable real-time adaptation in a virtual environment. Various embodiments with respect to the present disclosure will be explained in detail with reference to FIGs. 1-5.
[0027] FIG. 1 illustrates an example block diagram 100 of an electronic device 102 for selection of an origin point for a ray extended to determine a hit-point, in accordance with embodiments of the present disclosure.
[0028] Referring to FIG. 1, the electronic device 102 may include one or more processors 104, a memory 106, and an interface(s) 108. The one or more processors 104 may be implemented as one or more microprocessors, microcomputers, microcontrollers, digital signal processors, central processing units, logic circuitries, and/or any devices that manipulate data based on operational instructions. Among other capabilities, the one or more processor(s) 104 may be configured to fetch and execute computer-readable instructions stored in the memory 106 of the electronic device 102. The memory 106 may store one or more computer-readable instructions or routines, which may be fetched and executed to select the origin point for the ray extended to determine the hit-point. The memory 106 may include any non-transitory storage device including, for example, volatile memory such as Random-Access Memory (RAM), or non-volatile memory such as Erasable Programmable Read-Only Memory (EPROM), flash memory, and the like.
[0029] The interface(s) 108 may comprise a variety of interfaces, for example, interfaces for data input and output devices, referred to as I/O devices, storage devices, and the like. The interface(s) 108 may facilitate communication of the electronic device 102 with various devices coupled to it. The interface(s) 108 may also provide a communication pathway for one or more components of the electronic device 102. Examples of such components include, but are not limited to, processing engine(s) 110, sensors 112, and a database 114. The database 114 may include data that is either stored or generated as a result of functionalities implemented by any of the components of the processing engine(s) 110.
[0030] In an embodiment, the processing engine(s) 110 may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processing engine(s) 110. In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processing engine(s) 110 may be processor executable instructions stored on a non-transitory machine-readable storage medium and the hardware for the one or more processor(s) 104 may comprise a processing resource (for example, one or more processors), to execute such instructions. In the present examples, the machine-readable storage medium may store instructions that, when executed by the processing resource, implement the processing engine(s) 110. In such examples, the electronic device 102 may comprise the machine-readable storage medium storing the instructions and the processing resource to execute the instructions, or the machine-readable storage medium may be separate but accessible to the electronic device 102 and the processing resource. In other examples, the processing engine(s) 110 may be implemented by an electronic circuitry. The processing engine(s) 110 may include a Three-Dimension (3D) generation module 116, an angle determination module 118, a ray extension module 120, a hit point determination module 122, an interpolation module 124, and other module(s) 126. The other module(s) 126 may implement functionalities that supplement applications/functions performed by the processing engine(s) 110.
[0031] In some embodiments, the electronic device 102 may be configured to allow users to place 3D objects in a mixed reality or an augmented reality environment (collectively referred to as a virtual environment). For a user to interact with a virtual environment, the electronic device 102 may capture images of the physical environment around the electronic device 102 using the sensors 112. The sensors 112 of the electronic device 102 may capture the images and send the captured images to the processing engine(s) 110. In some embodiments, the sensors 112 may include, but are not limited to, cameras, image sensors, optical sensors, and the like.
[0032] In an embodiment, once the processing engine(s) 110 receives the captured images from the sensors 112, the 3D generation module 116 may generate a 3D reconstruction of the physical environment and transmit information of the 3D reconstruction to the angle determination module 118. In some embodiments, the angle determination module 118 may determine an angle of the electronic device 102 with respect to a reference surface on the 3D reconstruction using the sensors 112. In an embodiment, the sensors 112 may include, but are not limited to, accelerometer sensors, gyroscope sensors, motion sensors, gravity sensors, and the like. The sensors 112 may provide a tilt angle with respect to the y-axis. In some embodiments, the interpolation module 124 may be configured to determine an interpolation value based on the angle. In some embodiments, the interpolation module 124 may determine the interpolation value using linear interpolation of the angle, along with one or more parameters as described subsequently.
[0033] In some embodiments, the ray extension module 120 may be configured to determine any one or a combination of an origin point (such as origin point 206 of FIGs. 2A-2K) and a deviation value (such as deviation angles D1 and D2 of FIGs. 2J and 2K) based on the interpolation value. In some embodiments, the ray extension module 120 may be configured to extend a ray (such as ray 208 of FIGs. 2A-2K) based on at least one of the origin point 206 or the deviation angle. In some embodiments, the ray 208 may be extended to intersect with an opposing surface on the 3D reconstruction. In some examples, the opposing surface may correspond to a surface indicative of a wall, a floor, or a structure such as a cupboard, a table, etc. In some embodiments, the ray extension module 120 may be configured to extend the ray 208 from the origin point 206. In some embodiments, the ray extension module 120 may be configured to extend the ray 208 at a deviation angle equal to the deviation value with respect to the electronic device 102.
[0034] In some embodiments, the ray extension module 120 may be configured to determine the origin point 206 based on the interpolation value. The origin point 206 may be indicative of the origination point for the ray 208. The origin point 206 may be indicative of a point on the geometry of the electronic device 102. In some embodiments, the origin point 206 may be a point of an object representing the electronic device 102 in the 3D reconstruction. In some embodiments, the ray extension module 120 may determine an upper bound value and a lower bound value based on a geometry of the electronic device 102. In some embodiments, the ray extension module 120 may determine the upper bound value and the lower bound value as a proportion of a height dimension associated with the electronic device 102. The upper bound and the lower bound may define the range for the origin point 206. In such embodiments, the interpolation module 124 may determine the interpolation value as a linear interpolation of the upper bound value, the lower bound value, and the angle. In an embodiment, once the ray 208 is extended, the hit point determination module 122 may be configured to determine the hit point based on an intersection of the ray 208 with the opposing surface of the 3D reconstruction.
[0035] In an embodiment, the ray extension module 120 may normalize the angle to a value between predefined ranges to determine the origin point. For example, the value may be varied between the predefined ranges based on the geometry and the angle of the electronic device 102 to determine the origin point between the predefined ranges. In some examples, the angle may be normalized to a value between 0 and 1, however it may be appreciated by those skilled in the art that the predefined ranges may be suitably adapted based on requirements.
[0036] In other embodiments, the ray extension module 120 may be configured to determine the deviation value based on the interpolation value. The deviation value may correspond to the deviation angle with which the ray 208 may be extended from the origin point 206. In some embodiments, the origin point 206 may be dynamically determined, as explained above. In other embodiments, the origin point 206 may be predetermined/fixed, such as at the centre of the electronic device 102, for example. In some embodiments, the interpolation module 124 may be configured to determine the interpolation value as a linear interpolation of a predetermined ceiling value, a predetermined floor value, and the angle. The predetermined ceiling value and the predetermined floor value may be indicative of an upper bound and a lower bound, respectively, that the deviation value may have.
[0037] FIGs. 2A-2H illustrate example representations 200A, 200B, 200C, 200D, 200E, 200F, 200G, and 200H for extending a ray 208 to determine a hit point (such as a hit point 210), in accordance with embodiments of the present disclosure. FIG. 2I illustrates an example representation 200I for determining the interpolation value, in accordance with embodiments of the present disclosure.
[0038] Referring to FIG. 2I, an electronic device 102 may tilt at a specific angle ß, where ß is the angle between the electronic device 102 and a reference surface (e.g., ground 202 or a wall 204). The reference surface may be a plane parallel or perpendicular to the ground 202, based on the implementation. The angle ß may be determined using sensors (e.g., 112). In an embodiment, optimal object placement positions may be identified regardless of the orientation of the electronic device 102 using the sensors (e.g., 112). To enhance system efficiency and minimize battery consumption, the sensor data may be retrieved and hit tests performed at a frequency of 30 Hz. In FIG. 2I and the forthcoming examples, the angle ß may be determined with respect to a plane 212 that is perpendicular to the ground 202.
[0039] FIGs. 2B, 2D, 2F, and 2H indicate the upper bound, the lower bound, and the shifting of the origin point 206 therebetween as the angle ß changes. In some embodiments, the upper bound value and the lower bound value may indicate points on a geometry of the electronic device 102, such as the front screen thereof. In such instances, the upper bound value and the lower bound value may be determined as proportions of a height dimension associated with the electronic device 102. In an embodiment, an angle of the electronic device 102 may be normalized to a value within a predefined range. For example, considering a positive constant value as the height dimension, with the top of the electronic device 102 having a y-coordinate of 0 and the bottom having a y-coordinate equal to the positive constant value, the upper bound value may be 0.5 times the height dimension, while the lower bound value may be 0.9 times the height dimension. The electronic device 102 may be configured to dynamically determine the origin point 206 as a point between the upper bound and the lower bound based on the angle using the interpolation value.
[0040] Referring to FIGs. 2A-2H, for example, when a user holds an electronic device 102 in an angle (e.g., from about 10 degrees to about 90 degrees) with respect to a reference surface, the origin point 206 from which the ray 208 extends may be determined dynamically with respect to the angle. In some examples, the electronic device 102 may be configured to smoothly transition the position of the origin point 206 from the upper bound to the lower bound based on the angle ß. The electronic device 102 may be configured to move the origin point 206 as the angle ß changes within a predetermined range (such as between about 10 degrees to about 90 degrees with respect to the plane 212, for example). However, the range may be suitably modified based on requirements. Referring to FIG. 2A, in some examples, when the user tilts the electronic device 102 parallel to the plane 212 (where angle ß is equal to about 80 degrees), the origin point 206 may be at the upper bound as shown in FIG. 2B.
[0041] When the electronic device 102 is tilted to a position as shown in FIG. 2C where the angle ß is equal to about 60 degrees, then the origin point 206 may be moved towards the lower bound, as shown in FIG. 2D. In such examples, the electronic device 102 may be configured to move the origin point 206 to a point that is about 0.6 times the height dimension of the electronic device 102.
[0042] When the electronic device 102 is tilted to a position as shown in FIG. 2E where the angle ß is equal to 30 degrees, then the origin point 206 may be moved further towards the lower bound, as shown in FIG. 2F. In such examples, the electronic device 102 may be configured to move the origin point 206 to a point that is 0.8 times the height dimension of the electronic device 102.
[0043] Referring to FIG. 2G, in some scenarios, when the user tilts the electronic device 102 such that the angle ß is equal to 10 degrees with respect to the plane 212 or lower, the origin point 206 may be at the lower bound as illustrated in FIG. 2H.
[0044] When the electronic device 102 is moved from a position shown in FIG. 2G to a position shown in FIGs. 2E, 2C, or 2A, the origin point 206 may be moved towards the upper bound as illustrated in FIGs. 2H, 2F, 2D and 2B, respectively.
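The shifting of the origin point 206 across FIGs. 2A-2H can be sketched as a clamped linear interpolation of the origin's y-coordinate. The 10-to-90-degree range and the 0.5/0.9 bound proportions below are taken from the examples above; treating the transition as linear over that range is an assumption (the constants in Table 1 differ), so intermediate values only approximate the figures.

```python
def origin_y(beta_deg, height, upper=0.5, lower=0.9,
             min_angle=10.0, max_angle=90.0):
    """y-coordinate of the ray's origin point as a proportion of the
    device height (0 = top of screen), interpolated with the tilt angle.
    """
    # Clamp the tilt angle to the supported range.
    beta = max(min_angle, min(max_angle, beta_deg))
    # Normalize to [0, 1]: 0 at min_angle, 1 at max_angle.
    t = (beta - min_angle) / (max_angle - min_angle)
    # Small angles keep the origin at the lower bound (0.9 * height);
    # large angles move it up to the upper bound (0.5 * height).
    return height * (lower + t * (upper - lower))
```

For a tilt of 30 degrees this yields 0.8 times the height, matching FIG. 2F; tilts of 10 degrees or less stay at the lower bound, and 90 degrees sits at the upper bound.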
[0045] As shown, the ray 208 may start from the origin point 206 and intersect a reference surface. In some embodiments, the reference surface may be either the wall 204 or the ground 202. In some examples, the ray 208 hitting the ground 202 may persist when the electronic device 102 is inclined at the angle relative to the x-axis corresponding to the ground 202. In such examples, the point at which the ray 208 hits the reference surface may be determined as the hit point 210. In some examples, the hit point 210 may be used to place a virtual object in a mixed reality or augmented reality application, such as for placing holograms of furniture or interior designs in a room. In other examples, the hit point 210 may be used in a first-person shooter video game for aiming bullets at targets.
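The intersection test that yields the hit point 210 is not spelled out in the disclosure; a standard ray-plane intersection, sketched below under the assumption that each reconstructed surface is locally planar, is one plausible way to compute it.

```python
def ray_plane_hit(origin, direction, plane_point, plane_normal, eps=1e-9):
    """Return the point where a ray intersects a plane, or None.

    origin, direction: ray start point and direction (3-tuples).
    plane_point, plane_normal: any point on the plane and its normal.
    """
    def dot(u, v):
        return u[0] * v[0] + u[1] * v[1] + u[2] * v[2]

    denom = dot(direction, plane_normal)
    if abs(denom) < eps:
        return None  # ray runs parallel to the plane: no hit point
    diff = tuple(p - o for p, o in zip(plane_point, origin))
    t = dot(diff, plane_normal) / denom
    if t < 0:
        return None  # the plane lies behind the ray's origin
    return tuple(o + t * d for o, d in zip(origin, direction))
```

Casting straight down from one metre above the ground plane, for example, returns the point directly beneath the origin, while a ray parallel to the plane yields no hit.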
[0046] In existing methods, the hit test point may remain static, typically at the centre of the screen. In the present disclosure, however, the hit test point may not be static; instead, it may be dynamic and may adjust based on a tilt angle of the electronic device 102.
[0047] In certain embodiments, the upper bound value and the lower bound value may represent coordinate values forming an origin of the ray casting locus. The y-coordinate is determined as a proportion of the height dimension, while the x-coordinate is determined as a proportion of the width dimension of the electronic device 102. In this context, the leftmost portion of the electronic device 102 corresponds to an x-coordinate of 0, and the rightmost portion has an x-coordinate equal to the width of the electronic device 102. In specific examples, both the upper and lower bounds have an x-coordinate equal to half of the width dimension, represented as w/2 in Table 1. In some embodiments, the upper bound value and the lower bound value may be varied corresponding to the geometry of the electronic device 102 in relation to a reference surface for positioning a virtual object. For example, the upper bound value is greater than the lower bound value if the virtual object is intended to be positioned on a reference surface such as the ground. Similarly, the lower bound value is greater than the upper bound value if the virtual object is intended to be positioned on a reference surface such as a wall.
Value of ß (electronic device tilt angle) | Value of x-coordinate | Value of y-coordinate
Less than 0                               | w/2                   | M * h
Greater than S                            | w/2                   | R * h
Between 0 and S                           | w/2                   | lerp(M * h, R * h, ß / D)

Table 1

[0048] In an example embodiment, R is a constant value of 0.5, M is a constant value of 0.9, D is a constant value of 35, and S is an angle with a constant value of 34 degrees. The origin point may be determined using a linear interpolation of the upper bound value, the lower bound value, and the angle (ß). The linear interpolation function may be given by:

lerp(a, b, t) = a + t * (b - a),

- where 'a' is the upper bound,
- 'b' is the lower bound, and
- 't' is the interpolation parameter, which is determined by normalizing the angle (ß) to a value between 0 and 1 by dividing the angle by the predetermined constant 'D'.
[0049] The linear interpolation function may provide the interpolation value that allows the origin point 206 to be determined as a fraction of the distance between the upper bound (a) and the lower bound (b). Using interpolation functions may allow the origin point 206 to be smoothly transitioned to a point between the upper bound and the lower bound. Further, the interpolation function allows the transition of the origin point 206 from the upper bound towards the lower bound when the electronic device 102 is shifted from 0 degrees to 10 degrees to be greater than the transition of the origin point 206 towards the lower bound when the electronic device 102 is shifted from 80 degrees to 90 degrees. In such examples, interpolation functions may allow the origin point 206 to move in an interpolated manner corresponding to the tilt angle. However, it may be appreciated by those skilled in the art that the interpolation value may also be determined using other interpolation functions, such as a polynomial interpolation function, an exponential interpolation function, and the like, but not limited thereto.
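Table 1 and the lerp function above can be combined into a short sketch. The constants R, M, D, and S use the example values from paragraph [0048], and the handling of ß outside the 0-to-S range follows the table's first two rows.

```python
def lerp(a, b, t):
    """Linear interpolation: returns a at t = 0 and b at t = 1."""
    return a + t * (b - a)

def ray_origin(beta, w, h, R=0.5, M=0.9, D=35.0, S=34.0):
    """(x, y) origin of the ray per Table 1, for a device of width w
    and height h tilted at angle beta (degrees).
    """
    x = w / 2.0                      # both bounds sit at half the width
    if beta < 0:
        y = M * h                    # below horizontal: lower bound
    elif beta > S:
        y = R * h                    # steep tilt: upper bound
    else:
        y = lerp(M * h, R * h, beta / D)
    return (x, y)
```

For a 100 x 200 device, a tilt of 17.5 degrees (half of D) places the origin halfway between the bounds, at y = 140.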
[0050] FIGs. 2J-2K illustrate example representations 200J and 200K of the ray 208 extended at different deviation angles, in accordance with embodiments of the present disclosure.
[0051] In some embodiments, instead of shifting the position of the origin point 206 based on the tilt angle of the electronic device 102, the deviation value may be determined. The deviation value (such as D1 and D2 shown in FIGs. 2J and 2K, respectively) may be indicative of the (deviation) angle at which the extended ray 208 deviates from the electronic device 102. In such examples, the origin point 206 may be fixed, such as at a centre point of a screen. The electronic device 102 may dynamically determine the deviation value for extending the ray 208 based on the angle. In some embodiments, the deviation value may be determined using the interpolation value. In such embodiments, the deviation value may be determined as a linear interpolation of the predetermined ceiling value, the predetermined floor value, and the angle. In some examples, the predetermined ceiling value may correspond to the variable 'a', the predetermined floor value may correspond to the variable 'b', and the normalized angle may correspond to the variable 't' in the linear interpolation function lerp(a, b, t). The lerp function may yield the interpolation value, which may be indicative of a fraction, percentage, or value that may be used to determine the deviation value.
[0052] Referring to FIG. 2J, when a user holds the electronic device 102 at a certain angle, for example, at 80 degrees with respect to the x-axis or the ground 202, the deviation angle (corresponding to the deviation value) may be D1. Referring to FIG. 2K, when a user holds the electronic device 102 at a certain angle, for example, at 10 degrees with respect to the x-axis or the ground 202, the deviation angle (corresponding to the deviation value) may be D2. As shown, D2 may be greater than D1. D2 may be closer to the predetermined ceiling value than D1, and D1 may be closer to the predetermined floor value than D2. An interpolation function may be used to determine the deviation value as a value between the predetermined floor value and the predetermined ceiling value. Further, using a linear interpolation function may allow the deviation value to smoothly transition between the predetermined floor value and the predetermined ceiling value as the tilt angle of the electronic device 102 shifts.
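The deviation-angle variant can be sketched the same way. The 90-degree normalization constant below is an assumption (the disclosure leaves this constant, like 'D' above, implementation-specific), and the floor/ceiling values passed in are placeholders.

```python
def lerp(a, b, t):
    """Linear interpolation: returns a at t = 0 and b at t = 1."""
    return a + t * (b - a)

def deviation_angle(beta_deg, floor_val, ceil_val, max_angle=90.0):
    """Deviation angle for the ray, interpolated between the
    predetermined ceiling and floor values based on the tilt angle.
    """
    # Normalize the tilt angle to [0, 1] (max_angle is an assumed constant).
    t = max(0.0, min(1.0, beta_deg / max_angle))
    # A near-flat device (small beta) gets a deviation near the ceiling
    # value (D2 in FIG. 2K); a steep tilt nears the floor value (D1).
    return lerp(ceil_val, floor_val, t)
```

With a floor of 5 degrees and a ceiling of 45 degrees, a 10-degree tilt yields a larger deviation than an 80-degree tilt, matching the D2 > D1 relationship described above.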
[0053] The ray 208 may be extended from the determined origin point 206, at the determined deviation angle, or both. The extended ray 208 may intersect with opposing surfaces in the 3D reconstruction. In some examples, the opposing surfaces may be virtual surfaces in the 3D reconstruction which correspond to physical surfaces in the environment around the electronic device 102. In some examples, the opposing surfaces may be indicative of virtual counterparts of a floor/ground, walls, tables, cupboards, furniture, stools, stairs, windows, persons, and the like. The intersections of the ray 208 with the opposing surfaces may be determined to be the hit point 210. By allowing either the origin point 206 or the deviation value to be determined dynamically based on the tilt angle of the electronic device 102, the electronic device 102 may predictably determine the hit point 210. For example, by moving the origin point 206 or changing the deviation angle of the ray 208 corresponding to the tilt angle of the electronic device 102, it may be ensured that the ray 208 intersects with a target surface (such as the reference surface or the ground 202). In some examples, ensuring that the hit points 210 are determined at the target surface may allow the electronic device 102 to provide intended operations with increased predictability, such as placing a virtual object (e.g., a refrigerator) perpendicular to the ground 202 instead of on the wall. Hence, the electronic device 102 may allow for dynamically adjusting ray casting in real time with respect to an orientation of a user device to enhance responsiveness in dynamic scenarios. By customizing the starting point (i.e., the origin point 206) of the rays 208, the electronic device 102 may provide a personalized and comfortable interaction experience based on user preferences, and contribute to a more realistic and immersive user experience.
[0054] FIGs. 3A-3B illustrate example representations 300A and 300B of dynamic ray casting, in accordance with embodiments of the present disclosure.
[0055] Referring to FIGs. 3A-3B, when a user enables an immersive reality-based application associated with an electronic device 102, a virtual box may be represented in a virtual environment to guide the user. For example, the virtual box may be represented as a Product Guide Marker (PGM). Initially, a primary PGM 302 may be generated to indicate a hit test conducted during an instant placement mode that provides real-time feedback on potential object placements. Once the electronic device 102 identifies the physical environment, a secondary PGM 304 may be generated to signify the hit test. Once the secondary PGM 304 is generated, a virtual dot 306 may be positioned at its centre, representing hit points 308, i.e., a result of ray casting from an origin point on a screen into a scene (the virtual environment). At this time, when the electronic device 102 is continuously tilted back and forth during interaction, i.e., as rays intersect with geometries, the hit points 308 may be dynamically adjusted based on changes in the angle of interaction to enable accurate and responsive virtual interactions. This dynamic adaptation of the hit points 308 may enable users to consistently point to the ground 202, regardless of the tilt angle measured by the sensors 112. This feature enhances user convenience by allowing seamless ground surface interactions at any holding angle of the electronic device 102.
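To see why a tilt-driven deviation keeps the hit points 308 on the ground at shallow holding angles, consider a simplified side-view model. The device height and the pitch cutoffs below are illustrative assumptions, not values from the disclosure.

```python
import math

def ground_hit_distance(device_height_m: float,
                        tilt_deg: float,
                        deviation_deg: float):
    """Horizontal distance from the user to the ground hit point for a
    ray pitched (tilt + deviation) degrees below the horizontal.

    Returns None when the ray never reaches the ground in front of the
    user (pitch at or above horizontal, or straight down and beyond).
    """
    pitch = math.radians(tilt_deg + deviation_deg)
    if pitch <= 0.0 or pitch >= math.pi / 2:
        return None
    # Right-triangle geometry: height / tan(pitch) gives the forward reach.
    return device_height_m / math.tan(pitch)
```

With the device held only 10 degrees from the ground, an undeviated ray hits the ground several metres away (or, at 0 degrees, not at all), whereas adding a deviation of roughly 27 degrees pulls the hit point to about two metres in front of the user, which is why the shallow-angle case of FIG. 3B benefits from the larger deviation D2.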
[0056] FIG. 4 illustrates a flow chart of an example method 400 for selection of an origin point for a ray extended to determine a hit point, in accordance with embodiments of the present disclosure.
[0057] Referring to FIG. 4, at block 402, the method 400 may include generating, by one or more processors such as processors 104 of FIG. 1, a three-dimensional (3D) reconstruction of an environment using one or more sensors associated with an electronic device, such as sensors 112 and electronic device 102 of FIG. 1.
[0058] At block 404, the method 400 may include determining, by the one or more processors, an angle of the electronic device with respect to a reference surface on the 3D reconstruction using the one or more sensors.
[0059] At block 406, the method 400 may include determining, by the one or more processors, an interpolation value based on the angle.
[0060] At block 408, the method 400 may include determining, by the one or more processors, at least one of an origin point or a deviation value based on the interpolation value.
[0061] At block 410, the method 400 may include extending, by the one or more processors, the ray to intersect with any surface on the 3D reconstruction based on at least one of the origin point or the deviation value.
[0062] At block 412, the method 400 may include determining, by the one or more processors, an intersection of the ray with the surface as the hit point.
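Blocks 404-412 can be summarized in a single sketch that derives both the screen-space origin point (per claims 3-4, a proportion of a height dimension between a lower bound and an upper bound) and the deviation angle from the same normalized tilt. All numeric bounds below are assumptions for illustration.

```python
def dynamic_ray_parameters(tilt_deg: float,
                           screen_height_px: int,
                           lower_bound: float = 0.4,
                           upper_bound: float = 0.7,
                           floor_deg: float = 0.0,
                           ceiling_deg: float = 30.0):
    """Return (origin_y_px, deviation_deg) for a ray cast from the screen.

    Both quantities are linear interpolations driven by the normalized
    tilt angle (blocks 404-408). The bound constants are hypothetical:
    the origin bounds are proportions of the screen height, and the
    deviation bounds are the assumed floor and ceiling values.
    """
    t = max(0.0, min(1.0, tilt_deg / 90.0))  # blocks 404-406
    origin_y = (lower_bound + (upper_bound - lower_bound) * t) * screen_height_px
    deviation = ceiling_deg + (floor_deg - ceiling_deg) * t  # block 408
    return origin_y, deviation
```

The returned origin point and deviation angle would then parameterize the ray that is extended to the opposing surface and resolved to a hit point (blocks 410-412).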
[0063] FIG. 5 illustrates an exemplary computer system 500 in which or with which embodiments of the present disclosure may be implemented.
[0064] As shown in FIG. 5, the computer system 500 may include an external storage device 510, a bus 520, a main memory 530, a read only memory 540, a mass storage device 550, a communication port 560, and a processor 570. A person skilled in the art will appreciate that the computer system 500 may include more than one processor and communication ports. The processor 570 may include various modules associated with embodiments of the present disclosure.
[0065] In an embodiment, the communication port 560 may be any of an RS-232 port for use with a modem-based dialup connection, a 10/100 Ethernet port, a Gigabit or 10 Gigabit port using copper or fiber, a serial port, a parallel port, or other existing or future ports. The communication port 560 may be chosen depending on a network, such as a Local Area Network (LAN), a Wide Area Network (WAN), or any network to which the computer system 500 connects.
[0066] In an embodiment, the memory 530 may be a Random-Access Memory (RAM), or any other dynamic storage device commonly known in the art. The read-only memory 540 may be any static storage device(s), e.g., but not limited to, Programmable Read Only Memory (PROM) chips for storing static information, e.g., start-up or Basic Input/Output System (BIOS) instructions for the processor 570.
[0067] In an embodiment, the mass storage device 550 may be any current or future mass storage solution, which can be used to store information and/or instructions. Exemplary mass storage solutions include, but are not limited to, Parallel Advanced Technology Attachment (PATA) or Serial Advanced Technology Attachment (SATA) hard disk drives or solid-state drives (internal or external, e.g., having Universal Serial Bus (USB) and/or Firewire interfaces), one or more optical discs, Redundant Array of Independent Disks (RAID) storage, e.g., an array of disks (e.g., SATA arrays).
[0068] In an embodiment, the bus 520 communicatively couples the processor(s) 570 with the other memory, storage, and communication blocks. The bus 520 may be, e.g., a Peripheral Component Interconnect (PCI)/PCI Extended (PCI-X) bus, Small Computer System Interface (SCSI), USB, or the like, for connecting expansion cards, drives, and other subsystems, as well as other buses, such as a front side bus (FSB), which connects the processor 570 to the computer system 500.
[0069] Optionally, operator and administrative interfaces, e.g., a display, keyboard, joystick, and a cursor control device, may also be coupled to the bus 520 to support direct operator interaction with the computer system 500. Other operator and administrative interfaces may be provided through network connections connected through the communication port 560. Components described above are meant only to exemplify various possibilities. In no way should the aforementioned exemplary computer system 500 limit the scope of the present disclosure.
[0070] While the foregoing describes various embodiments of the disclosure, other and further embodiments of the disclosure may be devised without departing from the basic scope thereof. The scope of the disclosure is determined by the claims that follow. The disclosure is not limited to the described embodiments, versions, or examples, which are included to enable a person having ordinary skill in the art to make and use the disclosure when combined with information and knowledge available to the person having ordinary skill in the art.

ADVANTAGES OF THE PRESENT DISCLOSURE
[0071] The present disclosure relates to a device and a method for dynamically adjusting ray casting in real-time with respect to an orientation of a user device to enhance responsiveness in dynamic scenarios.
[0072] The present disclosure relates to a device and a method that customize a starting point of rays to provide a personalized and comfortable interaction experience based on user preferences.
[0073] The present disclosure relates to a device and a method for contributing to a more realistic and immersive user experience.
[0074] The present disclosure relates to a device and a method for optimally placing a 3D object in an AR environment.
Claims:
1. A method (400) for extending a ray (208) to determine a hit point (210), the method (400) comprising:
generating (402), by one or more processors (104), a three-dimensional (3D) reconstruction of an environment using one or more sensors associated with an electronic device (102);
determining (404), by the one or more processors (104), an angle of the electronic device (102) with respect to a reference surface on the 3D reconstruction using the one or more sensors;
determining (406), by the one or more processors (104), an interpolation value based on the angle;
determining (408), by the one or more processors (104), at least one of an origin point (206) or a deviation value based on the interpolation value;
extending (410), by the one or more processors (104), the ray (208) to intersect with an opposing surface on the 3D reconstruction based on at least one of the origin point (206) or the deviation value; and
determining (412), by the one or more processors (104), an intersection of the ray (208) with the opposing surface as the hit point (210).

2. The method (400) as claimed in claim 1, wherein for extending (410) the ray (208), the method (400) comprises at least one of:
extending, by the one or more processors (104), the ray (208) from the origin point (206); or
extending, by the one or more processors (104), the ray (208) at a deviation angle equal to the deviation value with respect to the electronic device (102).

3. The method (400) as claimed in claim 1, wherein for determining (408) the origin point (206), the method (400) comprises:
determining, by the one or more processors (104), an upper bound value and a lower bound value based on the geometry of the electronic device (102);
determining, by the one or more processors (104), the interpolation value as a linear interpolation of the upper bound value, the lower bound value, and the angle; and
determining, by the one or more processors (104), the origin point (206) for the ray (208) based on the interpolation value.

4. The method (400) as claimed in claim 3, wherein for determining the upper bound value and the lower bound value, the method (400) comprises determining, by the one or more processors (104), the upper bound value and the lower bound value as a proportion of a height dimension associated with the electronic device (102).

5. The method (400) as claimed in claim 1, wherein for determining (408) the deviation value, the method (400) comprises:
determining, by the one or more processors (104), the interpolation value as a linear interpolation of a predetermined ceiling value, a predetermined floor value, and the angle; and
determining, by the one or more processors (104), the deviation value for the ray (208) based on the interpolation value.

6. An electronic device (102) for extending a ray (208) to determine a hit point, comprising:
one or more processors (104); and
a memory (106) coupled to the one or more processors (104), wherein the memory (106) comprises processor-executable instructions, which, on execution, cause the one or more processors (104) to:
generate a three-dimensional (3D) reconstruction of an environment using one or more sensors associated with the electronic device (102);
determine an angle of the electronic device (102) with respect to a reference surface on the 3D reconstruction using the one or more sensors;
determine an interpolation value based on the angle;
determine at least one of an origin point (206) or a deviation value based on the interpolation value;
extend the ray (208) to intersect with an opposing surface on the 3D reconstruction based on at least one of the origin point (206) or the deviation value; and
determine an intersection of the ray (208) with the opposing surface as the hit point.

7. The electronic device (102) as claimed in claim 6, wherein to extend the ray (208), the one or more processors (104) are configured to:
extend the ray (208) from the origin point (206); or
extend the ray (208) at a deviation angle equal to the deviation value with respect to the electronic device (102).

8. The electronic device (102) as claimed in claim 6, wherein to determine the origin point (206), the one or more processors (104) are configured to:
determine an upper bound value and a lower bound value based on the geometry of the electronic device (102);
determine the interpolation value as a linear interpolation of the upper bound value, the lower bound value, and the angle; and
determine the origin point (206) for the ray (208) based on the interpolation value.

9. The electronic device (102) as claimed in claim 8, wherein to determine the upper bound value and the lower bound value, the one or more processors (104) are configured to determine the upper bound value and the lower bound value as a proportion of a height dimension associated with the electronic device (102).

10. The electronic device (102) as claimed in claim 6, wherein to determine the deviation value, the one or more processors (104) are configured to:
determine the interpolation value as a linear interpolation of a predetermined ceiling value, a predetermined floor value, and the angle; and
determine the deviation value for the ray (208) based on the interpolation value.

Documents

Application Documents

# Name Date
1 202441016872-STATEMENT OF UNDERTAKING (FORM 3) [08-03-2024(online)].pdf 2024-03-08
2 202441016872-POWER OF AUTHORITY [08-03-2024(online)].pdf 2024-03-08
3 202441016872-FORM 1 [08-03-2024(online)].pdf 2024-03-08
4 202441016872-DRAWINGS [08-03-2024(online)].pdf 2024-03-08
5 202441016872-DECLARATION OF INVENTORSHIP (FORM 5) [08-03-2024(online)].pdf 2024-03-08
6 202441016872-COMPLETE SPECIFICATION [08-03-2024(online)].pdf 2024-03-08