Abstract: METHOD AND SYSTEM FOR OPTIMIZING THE DISPLAY AND PLACEMENT OF ITEMS IN AN AUGMENTED REALITY ENVIRONMENT. A method for rendering augmented reality e-commerce that optimizes the display and placement of items. At the first step, a device (102) casts a ray (104) to generate an augmented reality environment. At the second step, an object (106) is analyzed to calculate the dimensions to be placed in the augmented reality environment. At the third step, an optimum position is calculated on the ray (104) based on the dimensions of the object (106). At the fourth step, the object (106) is positioned at the optimum position (108, 110, 112) in the augmented reality environment. At the fifth step, the height of a virtual box (114) is calculated based on the optimum position. At the sixth step, the virtual box (114) is added at the bottom of the object (106). At the seventh step, the position of the object (106) is adjusted as per the user requirement. Figure 1
Description: FIELD OF THE INVENTION
[001] The present invention relates to the field of augmented reality, and in particular to a method for adjusting the position and size of three dimensional (3D) objects in an augmented reality environment to enrich the viewing experience.
BACKGROUND OF THE INVENTION
[002] Augmented reality (AR) is an interactive experience that combines the real world with computer-generated three dimensional (3D) content. Although augmented reality technology has advanced significantly, it still faces technological limitations that can affect user experience. Issues such as a limited field of view (FOV), low resolution, and latency in AR devices can detract from the immersive experience, causing disorientation or discomfort for users. These limitations also impact the accuracy and realism of AR applications, which are critical in areas like education, healthcare, and professional training. As the technology continues to advance, addressing these limitations will be critical in expanding the applicability and user adoption of AR.
[003] Typically, ray casting is used for image rendering of 3D objects and scenes because a line transforms to a line under projection. Instead of projecting curved edges and surfaces in the 3D scene onto the 2D image plane, transformed lines (rays) are intersected with the objects in the scene. Further, in ray casting the object is typically positioned at a hit point, which is obtained by the intersection of the extended ray with the scene. Thereby, the object is by default placed at the hit point irrespective of the dimensions of the object to be placed in the augmented reality environment.
[004] Sometimes the object is too small or too large, which hampers visibility for the user when the object is placed at the hit point. In typical augmented reality devices, however, placement of the object at the hit point is fixed, and the user has no control over positioning the object at an optimum distance in order to maintain visibility and experience a true or realistic appearance of the object. For example, if the object is small, such as a phone or a smart watch, and is placed at a point far from the camera, such as a small bottle placed 1.5 m to 2 m away from the camera, the bottle appears so small that it adds no value to the user experience.
[005] At present, augmented reality systems face limitations in effectively representing small 3D objects when placed close to the user, often resulting in suboptimal user experiences. Traditional approaches may compromise the user's line of sight or fail to convey the desired proximity effect.
[006] The prior art US10235810B2, titled "Augmented reality e-commerce for in-store retail", discloses that augmented reality e-commerce may maximize limited physical space within a store by providing virtual displays of products in predetermined physical spaces. A virtual shelf blueprint that includes positional information for virtual display locations on a virtual shelf may be received. Subsequently, geospatial sensor scans of multiple reference markers in a real-world space may be received. The multiple reference markers may correspond to reference points in the virtual shelf blueprint. The virtual shelf may be mapped to the real world based on the multiple reference markers, in which the mapping may be performed by assigning positional data of the multiple reference markers to the reference points in the virtual shelf blueprint. The virtual shelf may be populated with one or more 3-dimensional (3D) objects that virtually represent at least one real-world product for viewing via an augmented reality device.
[007] Another prior art, US11403829B2, titled "Object preview in a mixed reality environment", discloses images or renderings of items placed (virtually) within a physical space. Users can view such renderings; for example, a rendering of an item can be placed within a live camera view of the physical space. A snapshot of the physical space can be captured, and the snapshot can be customized, shared, etc. The renderings can be represented as two-dimensional images, e.g., virtual stickers, or as three-dimensional models of the items. Users can view different renderings, move those items around, and develop views of the physical space that may be desirable. The renderings can link to products offered through an electronic marketplace, and those products can be consumed. Further, collaborative design is enabled by modeling the physical space and enabling users to view and move around the renderings in a virtual view of the physical space.
[008] The aforesaid prior art does not solve the various challenges with respect to positioning the object in the augmented reality environment as per the user requirement and the size of the object, and thereby fails to enhance visibility of the object.
[009] In order to overcome the challenges associated with the state of the art, there is a need for a method which is adaptable to position the object in augmented reality with respect to the size of the object, in order to improve visibility and to provide an uninterrupted line of sight.
OBJECTIVE OF THE INVENTION
[0010] The primary objective of the present invention is to provide a method for optimizing the display and placement of items in augmented reality environment.
[0011] Another objective of the present invention is to provide the method and a system for optimizing the display and placement of items in augmented reality environment.
[0012] Another objective of the present invention is to provide the method and the system for optimizing the display and placement of items in an augmented reality environment, in which three dimensional (3D) objects are displayed in a manner that optimally conveys proximity to the user, enhancing the sense of realism.
[0013] Another objective of the present invention is to provide the method and the system for optimizing the display and placement of items in augmented reality environment for uninterrupted line of sight.
[0014] Yet another objective of the present invention is to provide the method and the system for optimizing the display and placement of items in augmented reality for an enhanced user experience.
SUMMARY OF THE INVENTION
[0015] A method for rendering augmented reality e-commerce that optimizes the display and placement of items. At the first step, a device casts a ray to generate an augmented reality environment. At the second step, an object is analyzed to calculate the dimensions to be placed in the augmented reality environment. At the third step, an optimum position is calculated on the ray based on the dimensions of the object. At the fourth step, the object is positioned at the optimum position in the augmented reality environment. At the fifth step, the height of a virtual box is calculated based on the optimum position. At the sixth step, the virtual box is added at the bottom of the object. At the seventh step, the position of the object is adjusted as per the user requirement. Further, the device includes but is not limited to a mobile phone, laptop, personal computer, or tablet. Further, the optimum position is a place on the ray that maintains visibility and line of sight.
[0016] Further, the method is implemented through a system comprising a device integrated with an image capturing unit for casting the ray in order to generate the augmented reality environment. Further, the device is integrated with a processor for performing one or more operations by executing instructions. Further, the processor is integrated with a memory which stores the instructions executed by the processor.
[0017] Further, the one or more operations performed by the processor include but are not limited to calculating the dimensions of the object, calculating an optimum distance from the device for placing the object in the augmented reality environment, executing an instruction for adjusting the position of the object to the optimum position on the ray, and adding the virtual box, based on the one or more positions, at the base of the object in order to maintain visibility. Further, the optimum distance includes a first distance and a second distance. Further, the first distance is closer to the device which is casting the ray, and the second distance is farther from the device.
[0018] Further, the method for adjusting the position of the object in the augmented reality environment comprises placing the object at the optimum distance and adding the virtual box at the bottom of the object, evaluating the visibility of the object from the device, and moving the object to the first distance when the placed object appears smaller than required or to the second distance when the object appears larger than required.
BRIEF DESCRIPTION OF DRAWINGS
[0019] The present invention will be better understood after reading the following detailed description of the presently preferred aspects with reference to the appended drawings:
[0020] Figure 1 illustrates a flow chart for a method rendering augmented reality e-commerce for optimizing the display and placement of items;
[0021] Figure 2 illustrates the object at an optimum position on the ray;
[0022] Figure 3 illustrates the schematic for calculating an optimum distance using field of view of the image capturing device;
[0023] Figure 4 illustrates pictorial representation of the object at a first position in the augmented reality environment;
[0024] Figure 5 illustrates pictorial representation of the object at a second position in the augmented reality environment; and
[0025] Figure 6 illustrates pictorial representation of the object at a hit point in the augmented reality environment.
DETAILED DESCRIPTION OF THE INVENTION
[0026] The following description describes various features and functions of the disclosed apparatus. The illustrative aspects described herein are not meant to be limiting. It may be readily understood that certain aspects of the disclosed apparatus can be arranged and combined in a wide variety of different configurations, all of which are contemplated herein.
[0027] The following description of preferred embodiments of the invention is not intended to limit the invention to these preferred embodiments, but rather to enable any person skilled in the art to make and use this invention.
[0028] These and other features and advantages of the present invention may be incorporated into certain embodiments of the invention and will become more fully apparent from the following description as set forth hereinafter.
[0029] Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope of the invention. In addition, descriptions of well-known functions and constructions are omitted for clarity and conciseness.
[0030] The terms and words used in the following description and claims are not limited to the bibliographical meanings, but, are merely used to enable a clear and consistent understanding of the invention. Accordingly, it should be apparent to those skilled in the art that the following description of exemplary embodiments of the present invention are provided for illustration purpose only and not for the purpose of limiting the invention.
[0031] It is to be understood that the singular forms “a,” “an,” and “the” include plural referents unless the context clearly dictates otherwise.
[0032] It should be emphasized that the term “comprises/comprising” when used in this specification is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps, components or groups thereof.
[0033] Accordingly, the present invention relates to an augmented reality (AR) system and method for enhancing user experience during the placement of three dimensional (3D) objects. The augmented reality (AR) system and method is capable of optimizing the display of virtual objects in an augmented reality (AR) environment, specifically focusing on object size and distance to the user.
[0034] The present invention introduces a novel method for determining the optimal placement of three dimensional (3D) objects in the augmented reality, taking into account both the size and distance of the object relative to the user. When the size and distance fall below a predefined threshold, the system dynamically selects a stool-based placement strategy. This strategy involves situating the virtual object on a virtual stool, aligning it with the user's line of sight while maintaining the perception of proximity.
[0035] In an embodiment, the present invention relates to a method and system for optimizing the display and placement of items in an augmented reality environment. Further, the system comprises a device integrated with an image capturing unit for casting the ray (104) in order to generate the augmented reality environment; a processor integrated with the device for performing one or more operations by executing instructions; and a memory unit, integrated with the processor, storing the instructions executed by the processor.
[0036] Further, the augmented reality system employs a sophisticated algorithm that assesses the dimensions of the virtual object and its proximity to the user. If the calculated parameters meet the predefined criteria, the system triggers the stool-based placement mechanism. The virtual stool is strategically positioned to ensure that the 3D object appears close to the user, creating a realistic and immersive visual experience without obstructing the user's line of sight.
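By way of illustration only, the following minimal Kotlin sketch shows how such a threshold check might be expressed. The ObjectDimensions type, the usesStoolPlacement function and the threshold values are assumptions introduced for this sketch and are not part of the disclosed implementation.

// Hypothetical sketch: decide between stool-based and default placement.
data class ObjectDimensions(val width: Float, val height: Float, val depth: Float)

// Assumed thresholds in metres; the actual criteria are implementation-specific.
const val SIZE_THRESHOLD = 0.3f
const val DISTANCE_THRESHOLD = 1.0f

fun usesStoolPlacement(dims: ObjectDimensions, distanceToUser: Float): Boolean {
    val largestExtent = maxOf(dims.width, dims.height, dims.depth)
    // Small objects that would sit close to the user are raised on a virtual stool.
    return largestExtent < SIZE_THRESHOLD && distanceToUser < DISTANCE_THRESHOLD
}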
[0037] In an embodiment, the processor may be implemented as a combination of hardware and programming (for example, programmable instructions) to implement one or more functionalities of the processor. In examples described herein, such combinations of hardware and programming may be implemented in several different ways. For example, the programming for the processor may be processor executable instructions stored in the memory and the hardware for the processor may comprise a processing resource (for example, one or more processors), to execute such instructions.
[0038] Figure 1 illustrates a flow chart for a method (100) for rendering augmented reality e-commerce for optimizing the display and placement of items, comprising the following steps:
[0039] At the first step, an augmented reality environment is established by casting a ray (104) from an origin point.
[0040] At the second step, an object (106) is analyzed, by the processor, to calculate the dimensions of the object (106).
[0041] At the third step, an optimum position (108, 110, 112) is calculated for positioning the object (106) on the ray (104).
[0042] At the fourth step, the object (106) is placed at the optimum position (108, 110, 112).
[0043] At the fifth step, the height of a virtual box (114) is calculated with respect to the placement of the object (106) on the ray (104).
[0044] At the sixth step, the virtual box (114) is added at the base of the object (106).
[0045] At the seventh step, the position of the object (106) is adjusted as per the user requirement.
[0046] In an embodiment, the device is placed at a position where the augmented reality environment is required to be developed. Further, the processor activates the image capturing unit on a command of the user. Further, the image capturing unit extends the ray (104) in order to generate the augmented reality environment. Further, the ray (104) originates from an origin point. Further, the origin point is selected in one or more ways. Further, the one or more ways may include, but are not limited to, user selected, dynamically selected and statically selected. Further, the dynamically selected and statically selected origin points are chosen by the processor, which executes the instruction for origin point selection.
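As a non-limiting illustration, a minimal Kotlin sketch of casting the ray (104) from a selected origin point is given below. The Vector3 and Ray types, the OriginMode enumeration and the castRay function are hypothetical stand-ins for whatever the underlying AR framework provides.

// Hypothetical types standing in for the AR framework's own math classes.
data class Vector3(val x: Float, val y: Float, val z: Float)
data class Ray(val origin: Vector3, val direction: Vector3)

enum class OriginMode { USER_SELECTED, DYNAMIC, STATIC }

// Sketch: choose the origin point and cast the ray used to build the AR scene.
fun castRay(mode: OriginMode, userPoint: Vector3?, cameraForward: Vector3): Ray {
    val origin = when (mode) {
        OriginMode.USER_SELECTED -> requireNotNull(userPoint) { "a user-selected origin point is required" }
        OriginMode.DYNAMIC, OriginMode.STATIC -> Vector3(0f, 0f, 0f) // processor-selected origin
    }
    return Ray(origin, cameraForward)
}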
[0047] Further, the processor analyses the dimensions of the object (106) to be placed in the augmented reality environment. Further, based on the dimensions of the object (106), the processor positions the object (106) in order to maintain visibility and line of sight from the user. Further, the processor places the object (106) at an optimum location from the user in the augmented reality environment. Further, the position of the object (106) may be adjusted, but the size of the object (106) remains the same as the original size. Further, the device includes but is not limited to a mobile phone, laptop, personal computer, or tablet. Further, the object (106) may be placed anywhere along the arc, as shown in figure 2.
[0048] Further, the virtual box (114) is placed at the bottom of the object (106) to eliminate the impression, for the user, of a hanging object (106). Further, the virtual box (114) may have, but is not limited to, a square, rectangular, cylindrical or triangular shape. Further, the shape and size of the virtual box (114) may be adjusted to serve the purpose of cross selling. Further, the size of the virtual box (114) may be adjusted as per the object's height from the base surface. Further, the base surface may include, but is not limited to, a wall and the ground.
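A minimal sketch of how the height of the virtual box (114) could be derived from the object's elevation above the base surface is shown below; the virtualBoxHeight function and its parameters are assumptions introduced for illustration.

// Sketch: the virtual box fills the gap between the base surface (e.g. the ground)
// and the bottom face of the object, so the object no longer appears to hang in mid-air.
fun virtualBoxHeight(objectBottomY: Float, baseSurfaceY: Float): Float {
    val gap = objectBottomY - baseSurfaceY
    // A non-positive gap means the object already rests on the surface; no box is needed.
    return if (gap > 0f) gap else 0f
}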
[0049] Further, the processor estimates the dimensions of the object (106) to be placed in the augmented reality environment. Further, the processor calculates the optimum distance for placing the object (106) in the augmented reality environment by utilizing the field of view of the image capturing unit, as shown in figure 3. Further, the processor calculates the optimum distance using equation (1) and equation (2).
val maxOf = diagonalWidth / 2    … (1)
val minimumDistance = maxOf / tan(Math.toRadians(cameraHorizontalFov))    … (2)
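Equations (1) and (2) above can be applied as in the following self-contained Kotlin sketch. The bounding-box inputs used to obtain the diagonal width are an assumption about how that value is derived, and the sketch simply mirrors the disclosed formula rather than reproducing the actual implementation.

import kotlin.math.sqrt
import kotlin.math.tan

// Sketch: minimum camera-to-object distance so that the object's footprint fits in the view.
fun minimumDistance(widthM: Double, depthM: Double, cameraHorizontalFovDeg: Double): Double {
    // Equation (1): half of the object's diagonal footprint.
    val diagonalWidth = sqrt(widthM * widthM + depthM * depthM)
    val maxOf = diagonalWidth / 2
    // Equation (2): as disclosed, the tangent of the horizontal FOV is used; whether this
    // angle is the full or the half field of view depends on how the framework reports it.
    return maxOf / tan(Math.toRadians(cameraHorizontalFovDeg))
}

For example, for an object with a 0.2 m by 0.2 m footprint and a 60 degree horizontal field of view, this sketch returns roughly 0.08 m, i.e. the object could be brought very close to the camera before its footprint overflows the view.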
[0050] Figure 4 illustrates the object (106) placed at the optimum position including the first position (108). Further, the first position (108) may be an ideal position.
[0051] Further, the object (106) is placed at the optimum position, including a first position (108), a second position (110) and a third position (112), based on the optimum distance, including a first distance, calculated through equations (1) and (2). Further, the processor calculates the height of the virtual box (114) to be placed below the object (106). Further, the processor calculates the height of the virtual box (114) with respect to the optimum position, including the first position (108) of the object (106), which is placed in the augmented reality environment. Further, the user may adjust the position of the object (106) on the ray (104) as per the requirement.
[0052] Figure 5 illustrates the object (106) placed at the second position (110) in the augmented reality environment. Further, the second position (110) is closer to the device (202). Further, the user may command the processor to move the object (106) to the second position (110) in case the user wants to see the object (106) closer than the first position (108). Further, the processor repositions the object (106) to the second position (110). Further, the processor calculates the height of the virtual box (114) as per the second position (110). Further, the height of the virtual box (114) is greater than that of the virtual box (114) placed at the first and the third positions (108, 112).
[0053] Further, the virtual box (114) is added at the base of the object (106) positioned at the second position (110). Further, the dimensions of the object (106) do not change and remain as the original dimensions, thereby enhancing the user's experience.
[0054] Figure 6 illustrates the object (106) placed at the third position (112) in the augmented reality environment. Further, the third position (112) is farther from the device (202). Further, the user commands the processor to move the object (106) to the third position (112). Further, the object (106) is placed at the third position (112) when the user wants to see the object (106) farther than the first position (108). Further, the processor repositions the object (106) to the third position (112). Further, the processor calculates the height of the virtual box (114) as per the third position (112). Further, the height of the virtual box (114) is decreased to keep the scale of the object (106) intact and in the line of sight.
[0055] Figures 5 and 6, in combination, illustrate user actions that involve observing the product from varying distances rather than from the ideal distance. For instance, in figure 4, although the object is positioned at the optimal distance, users may still desire to assess the product from closer or farther perspectives. Further, the user may explore adjustments to accommodate proximity and distance along the ray (104). While the size of the object remains constant, the height of the stool may be adjusted relative to the distance from the ground and the object's position along the axis and/or ray.
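Under the simplifying assumption that the ray (104) runs from the camera, at some eye height above the ground, down to the hit point on the ground, the stool (virtual box) height at any position along the ray can be sketched in Kotlin as follows; the stoolHeightAt function, the eyeHeight parameter and the linear similar-triangles model are illustrative assumptions rather than the disclosed calculation.

// Sketch under an assumed geometry: the ray runs from the camera (at eyeHeight above the
// ground) to the hit point on the ground at hitDistance. Moving the object along this ray
// keeps its scale fixed; only the height of the supporting virtual box changes.
fun stoolHeightAt(distanceAlongRay: Float, hitDistance: Float, eyeHeight: Float): Float {
    require(hitDistance > 0f) { "hit distance must be positive" }
    val t = (distanceAlongRay / hitDistance).coerceIn(0f, 1f)
    // Closer to the camera (small t) -> taller stool; at the hit point (t = 1) -> no stool.
    return eyeHeight * (1f - t)
}

With this model, moving the object from the first position (108) to the closer second position (110) increases the returned height, while moving it to the farther third position (112) decreases it, which is consistent with the behaviour described above.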
[0056] Further, the virtual box (114) is added at the base of the object (106) positioned at the third position (112). Further, the dimensions of the object (106) do not change and remain as the original dimensions, thereby enhancing the user's experience.
[0057] Further, the processor compares the calculated distance with the distance of the hit point. In case the calculated distance is greater than the distance of the hit point, the processor places the object (106) at the hit point, as shown in figure 7. Further, the ray (104) is extended to intersect with an opposing surface of the 3D reconstruction, and the intersection point is known as the hit point. As an extended functionality, the system may analyse user behaviour and dynamically adapt the way FSNs/SKUs or objects of smaller size are placed.
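A minimal Kotlin sketch of the comparison described above, in which the computed optimum distance is limited by the hit-point distance, is given below; the placementDistance function and its parameter names are assumptions made for illustration.

// Sketch: if the computed optimum distance would overshoot the hit point, fall back to
// placing the object at the hit point itself.
fun placementDistance(optimumDistance: Float, hitPointDistance: Float): Float =
    if (optimumDistance > hitPointDistance) hitPointDistance else optimumDistance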
[0058] ADVANTAGES OF THE INVENTION:
• The present invention discloses a system that ensures that small three dimensional (3D) objects are displayed in a manner that optimally conveys proximity to the user, enhancing the sense of realism.
• The present invention utilizes a virtual stool for placement of the object.
• The present invention mitigates issues associated with obstructed views, providing users with an uninterrupted line of sight to the augmented content.
• The innovative approach to object placement contributes to a more engaging and immersive augmented reality experience, making the virtual content feel seamlessly integrated into the user's physical environment.
• The present invention provides flexibility to the user for placement of the object in the augmented reality environment.
[0059] Although the embodiments herein are described with various specific embodiments, it will be obvious for a person skilled in the art to practice the invention with modifications. However, all such modifications are deemed to be within the scope of the invention.
Claims: 1. A method (100) for rendering augmented reality e-commerce for optimizing the display and placement of items, comprising:
Step 1: casting a ray (104), from an origin point, for generating an augmented reality environment;
Step 2: analyzing the dimensions of an object (106) to be placed in the augmented reality environment;
Step 3: calculating an optimum position (108,110, 112) on the ray (104) based on the dimension of the object (106);
Step 4: positioning the object (106) at the optimum position in the augmented reality environment;
Step 5: calculating height of a virtual box (114) based on the optimum position (108,110, 112);
Step 6: adding the virtual box (114) at a bottom of the object (106); and
Step 7: adjusting position of the object (106) as per user requirement.
2. The method (100) as claimed in claim 1, wherein the origin point is selected through one or more ways.
3. The method (100) as claimed in claim 1, wherein the one or more ways include, but are not limited to, user selected, dynamically selected and statically selected.
4. The method (100) as claimed in claim 1, wherein the optimum position (108,110, 112) is a place on the ray (104) maintaining visibility and line of sight.
5. The method (100) as claimed in claim 1, wherein the method (100) is implemented through a system, comprising:
I. a device (102) integrated with an image capturing unit for casting the ray (104) in order to generate the augmented reality environment;
II. a processor integrated with the device for performing one or more operations by executing instructions; and
III. a memory unit, integrated with the processor, storing the instructions executed by the processor.
6. The system as claimed in claim 5, wherein the device includes but is not limited to a mobile phone, a laptop, a personal computer, and a tablet.
7. The method (100) as claimed in claim 1, wherein the one or more operations performed by the processor comprise:
I. calculating dimensions of the object (106);
II. calculating an optimum distance, from the device, for placing the object (106) in the augmented reality environment;
III. executing an instruction for adjusting the position of the object (106) at the optimum position on the ray (104); and
IV. adding the virtual box (114), based on the optimum position including a first position (108), a second position (110) and a third position (112), at the base of the object (106) in order to maintain visibility.
8. The method as claimed in claim 4, wherein the optimum distance includes a first distance and a second distance.
9. The method as claimed in claim 4, wherein the first distance is closer to the device (102) and the second distance is farther from the device (102).
10. The method as claimed in claim 4, wherein the method for adjusting the position of the object (106) in the augmented reality environment comprises:
I. placing the object (106) at the optimum distance and adding the virtual box (114) at the bottom of the object (106);
II. evaluating visibility of the object (106) from the device (116); and
III. moving the object (106) to the optimum distance, wherein the object (106) is moved to the first distance when the placed object (106) is smaller than required and to the second distance when the object (106) is larger than required.