
System And Method For Determining Area Of Wall With Respect To Its Visualization Preferences

Abstract: A method (100) and system (200) of determining the area of a wall with respect to its visualization preferences, comprising: an input module (202) to capture a plurality of images of rooms; a pre-processing unit (210) to scan the received input images of the rooms and remove unwanted frames from them, wherein a feature extraction unit (212) extracts frames of the wall objects from the scanned images; a processing unit (214) for calculating the area of the extracted frames and highlighting the extracted frames of the wall objects, for hiding and visualizing each extracted object frame of the wall of the room with and without the extracted frames of the actual wall; and a controller (216) to provide the wall area calculation based on design simulations as per the visualization preferences set up.


Patent Information

Application #: 202341086088
Filing Date: 16 December 2023
Publication Number: 02/2024
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Parent Application:
Patent Number:
Legal Status:
Grant Date: 2024-07-12

Applicants

SKYTRIBE TECHNOLOGIES PVT LTD
75, 4TH MAIN, RAMAMURTHYNAGAR, HOYSALA NAGAR, DOORAVANI NAGAR POST, BANGALORE 560016, KARNATAKA

Inventors

1. JAY SAH
75, 4TH MAIN, RAMAMURTHYNAGAR, HOYSALA NAGAR, DOORAVANI NAGAR POST, BANGALORE 560016, KARNATAKA

Specification

Description:

FIELD OF THE INVENTION
The present invention relates to visual processing systems and methods. In particular, the present invention relates to a system and method for determining area of a wall with respect to its visualization preferences.

BACKGROUND OF THE INVENTION
End users and painting service providers still follow the traditional, age-old process for an interior/exterior painting project: a site visit for consultation, selection of colour schemes and paint type, measurement taking, calculations and quotations, all without any 2D or 3D visualization of the space that could have helped the customer make better decisions. The end user has no control over the project, as he must depend on the contractor to visit the site, take measurements, suggest paint types, texture options and stencils, apply a rate card for services depending on the type of paint or texture chosen, determine the square-foot paintable area from site measurements, and provide a quotation, all of which lacks clarity and transparency. This offline process takes a lot of time and effort, typically 2-7 days. The service provider suffers more, as he must put in this time, effort and cost without any confirmation of the project. With a lack of options, the end user is forced to settle for reference images from Google and a handwritten, rough estimate from the contractor. It is impractical for a contractor to provide a 3D rendered view of the customer's actual wall in the newly selected colour scheme or texture in order to get approval, as it takes a lot of extra work, cost and effort and must be outsourced. The customer, in turn, is rarely willing to pay extra for any 2D or 3D visualization, so the contractor avoids it and the customer has to accept the fact. Both parties need a solution to this problem that serves them both. The major challenges a user faces are:
i) He has to put time and effort into scanning and locating the corners of the wall, moving the phone camera from one corner to another, and there is no option to similarly scan the corners of the window, door and certain objects permanently fixed on the wall so that their area can be subtracted from the overall wall area, even though the user wants to calculate only the paintable surface area.
ii) While scanning the room, certain objects such as a TV, wall racks, photo frames, a table or cot touching the wall, or a table lamp lying on a table block the camera's view of the wall. So, when the user wants to visualize a colour, texture or pattern on the wall in the app, there is no option to either "show" or "hide" these overlapping objects as per the user's visualization needs.

Existing tools and systems in the market either calculate the square-foot area of a wall or visualize colour and texture over a wall. There is no tool that helps the user set visualization preferences first, so that the user can decide on the specific colour and design for each wall of a house (or commercial space), get the calculated area details for the paintable space, and then create a project online and find a vendor instantly, the way Ola and Uber work.

In order to overcome the above-mentioned drawbacks, there is a need for a system and method of determining the area of a wall with respect to its visualization preferences that saves the user's time, effort and money by making the process online, instant and hassle-free. An end user, say a house owner, need not consult any contractor: he can himself visualize and calculate for his new or existing home or office space, obtaining all required details (sizes, square-foot area, paint material type, texture and stencils, rate and price calculation with breakups, and a specification drawing for execution) and download them from the system within seconds. Another end user, say a designer or home decor company, can use the system the same way as a customer (house owner) and obtain the desired data, for which they otherwise had to depend on consulting a paint contractor. This saves a lot of time, effort and money for them too, since they previously had to treat this work as a free-of-cost service bundled with the complete painting project (the customer will not pay for it separately), and the time and effort it takes to create such data and get designs approved is hectic, done without any confirmation or advance payment for the project, which is why they were never willing to offer it. With the developed system they can offer it for free, as they no longer have to put in any time, effort or money; alternatively, they can ask the client to try it out on the system himself or herself, let the service provider know what was liked and finalized, and share the data with them for reference. The rate component can be edited as per the vendor's rates, while everything else remains constant: the space measurement and paintable square-foot area calculation are based on the universal formula practised in the market and do not vary from vendor to vendor or with the paint type or texture pattern selected. Even the material brand and rate can be selected to modify the calculation.
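As a rough illustration of the "universal formula" this passage refers to (gross wall area minus openings, then area times rate), here is a minimal sketch; the function names, dimensions and rate are hypothetical and not taken from the specification:

```python
def net_wall_area_sqft(wall_w_ft, wall_h_ft, openings):
    """Common market practice: gross wall area minus door/window
    openings, all in square feet. `openings` is a list of (w, h)."""
    gross = wall_w_ft * wall_h_ft
    deducted = sum(w * h for w, h in openings)
    return gross - deducted

def cost_breakup(area_sqft, rate_per_sqft):
    """Simple rate-based quotation breakup for one wall."""
    return {"area_sqft": area_sqft,
            "rate_per_sqft": rate_per_sqft,
            "total": area_sqft * rate_per_sqft}

# 12 ft x 10 ft wall with one 3x7 ft door and one 4x4 ft window:
area = net_wall_area_sqft(12, 10, [(3, 7), (4, 4)])
print(cost_breakup(area, rate_per_sqft=18))  # 83 sq ft at a sample rate
```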
The technical advancements disclosed by the present invention overcome the limitations and disadvantages of existing and conventional systems and methods.

SUMMARY OF THE INVENTION

The present invention relates to a system and method of determining area of a wall with respect to its visualization preferences.

An object of the present invention is to create a project by finalizing design attributes (design finalization means finalizing the paint shade/colour, paint type, paint brand, texture, pattern, stencil etc. for each wall of a project, as each of these components determines a different charge rate to be used in the calculation),
Another object of the present invention is to set up visualization preferences (visualization preferences means clearly defining the paintable area over the scan, being able to mark various object frames as "hide", "show" or "remove" as per the user's choice and needs),

Yet another object of the present invention is to determine the dimensions and area calculation of the actual paintable surface,

Yet another object of the present invention is to obtain square-foot area breakups, apply rates and obtain the overall project cost,

Yet another object of the present invention is to develop, in the app, a visualization of the applied colour, texture or stencil on the user's actual wall scan,

Yet another object of the present invention is to illustrate the outlines of the various objects such as a window, door, TV, rack, wall etc. so that the user has the option to set preferences on whether to show, hide or remove each from the final rendered view, and

Yet another object of the present invention is to provide a cost-effective, simple, fast and hassle-free system.

In an embodiment, a system of determining area of a wall with respect to its visualization preferences, said system comprising: an input module connected with an image capturing device configured to capture a plurality of images of rooms, wherein the image capturing device is connected with a communication network unit configured to transmit the captured images of the rooms to the input module, wherein the captured images of the rooms act as an input image which includes at least a portion of the walls of the rooms; a receiving unit configured to receive the input image from the input module through the communication network unit, wherein the communication network unit is further configured to transmit inputs from the input module to the receiving unit for constructing the outline structure and design of different objects on the walls of the rooms; a pre-processing unit connected to the receiving unit, configured to scan the received input images of the rooms and remove unwanted frames from them, wherein the pre-processing unit is connected to a feature extraction unit configured to extract frames (traces) of the wall objects from the scanned images; a processing unit connected to the feature extraction unit to map the extracted frames of the objects for calculating the area of the extracted frames and highlighting the extracted frames of the wall objects, for hiding and visualizing each extracted object frame with and without the extracted frames of the actual wall, wherein the processing unit selects at least one design option for the wall of the room, selected from a paint, a wallpaper, a texture paint, PVC panelling and a combination thereof, and calculates the area of the wall to make a design on the wall; and a controller connected to the processing unit to provide the wall area calculation based on design simulations as per the visualization preferences set up.

In an embodiment, a method of determining area of a wall with respect to its visualization preferences, said method comprising: a first step of capturing, by an input module, a plurality of images of rooms, wherein an image capturing device is connected with a communication network unit configured to transmit the captured images of the rooms to the input module, wherein the captured images of the rooms act as an input image which includes at least a portion of the walls of the rooms; a second step of receiving, by a receiving unit, the captured images of the rooms, wherein the communication network unit is configured to transmit inputs from the input module to the receiving unit for constructing the outline structure and design of different objects on the walls of the rooms; a third step of scanning, by a pre-processing unit, the received input images of the rooms and removing unwanted frames from them, wherein the pre-processing unit is connected to a feature extraction unit configured to extract frames of the wall objects from the scanned images; a fourth step of mapping, by a processing unit, the extracted frames of the objects for calculating the area of the extracted frames and highlighting the extracted frames of the wall objects, for hiding and visualizing each extracted object frame with and without the extracted frames of the actual wall, wherein the processing unit selects at least one design option for the wall of the room, selected from a paint, a wallpaper, a texture paint, PVC panelling and a combination thereof, and calculates the area of the wall to make a design on the wall; and a fifth step of providing, by a controller, the wall area calculation based on design simulations as per the visualization preferences set up.

To further clarify advantages and features of the present disclosure, a more particular description of the invention will be rendered by reference to specific embodiments thereof, which are illustrated in the appended drawings. It is appreciated that these drawings depict only typical embodiments of the invention and are therefore not to be considered limiting of its scope. The invention will be described and explained with additional specificity and detail with the accompanying drawings.

BRIEF DESCRIPTION OF FIGURES

These and other features, aspects, and advantages of the present disclosure will become better understood when the following detailed description is read with reference to the accompanying drawings in which like characters represent like parts throughout the drawings, wherein:
Figure 1 illustrates a block diagram of a system of determining area of a wall with respect to its visualization preferences;
Figure 2 illustrates a flow diagram of a method of determining area of a wall with respect to its visualization preferences;
Figure 3 illustrates a 3D view of a space with all general possible objects captured in the scene which need to be captured by the device's camera, in accordance with an exemplary embodiment of the present invention;
Figure 4 illustrates a scanned version of the space showing major outline traces (or frames) of various objects so that the user can set preferences prior to visualization, in accordance with an exemplary embodiment of the present invention;
Figure 5 illustrates a view of the user setting preferences by identifying which objects are to be shown or hidden during visualization (like a clock or TV) as well as which are to be removed (like a door and window), in accordance with an exemplary embodiment of the present invention;
Figure 6 illustrates a visual of the specific wall filled with the user's choice of colour or texture (as per the visualization preferences set up), in accordance with an exemplary embodiment of the present invention;
Figure 7 illustrates a visual of the specific wall highlighting only the paintable portion (ignoring other objects which were set as "hidden" or "shown" in the visualization preferences), in accordance with an exemplary embodiment of the present invention; and
Figure 8 shows a block diagram of a hardware structure of the image processing apparatus, in accordance with an exemplary embodiment of the present invention.
Further, skilled artisans will appreciate that elements in the drawings are illustrated for simplicity and may not necessarily have been drawn to scale. For example, the flow charts illustrate the method in terms of the most prominent steps involved to help improve understanding of aspects of the present disclosure. Furthermore, in terms of the construction of the device, one or more components of the device may have been represented in the drawings by conventional symbols, and the drawings may show only those specific details that are pertinent to understanding the embodiments of the present disclosure, so as not to obscure the drawings with details that will be readily apparent to those of ordinary skill in the art having the benefit of the description herein.

DETAILED DESCRIPTION:

For the purpose of promoting an understanding of the principles of the invention, reference will now be made to the embodiment illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended, such alterations and further modifications in the illustrated system, and such further applications of the principles of the invention as illustrated therein being contemplated as would normally occur to one skilled in the art to which the invention relates.
It will be understood by those skilled in the art that the foregoing general description and the following detailed description are exemplary and explanatory of the invention and are not intended to be restrictive thereof.
Reference throughout this specification to “an aspect”, “another aspect” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Thus, appearances of the phrase “in an embodiment”, “in another embodiment” and similar language throughout this specification may, but do not necessarily, all refer to the same embodiment.

The terms "comprises", "comprising", or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a process or method that comprises a list of steps does not include only those steps but may include other steps not expressly listed or inherent to such process or method. Similarly, one or more devices or sub-systems or elements or structures or components proceeded by "comprises...a" does not, without more constraints, preclude the existence of other devices or other sub-systems or other elements or other structures or other components or additional devices or additional sub-systems or additional elements or additional structures or additional components.

Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The system, methods, and examples provided herein are illustrative only and not intended to be limiting.
Embodiments of the present disclosure will be described below in detail with reference to the accompanying drawings.
Figure 1 illustrates a block diagram of a system (200) of determining area of a wall with respect to its visualization preferences, said system (200) comprising: an input module (202), an image capturing device (204), a communication network unit (206), a receiving unit (208), a pre-processing unit (210), a feature extraction unit (212), a processing unit (214) and a controller (216).
The input module (202) is connected with an image capturing device (204) configured to capture a plurality of images of rooms, wherein the image capturing device (204) is connected with a communication network unit (206) configured to transmit the captured images of the rooms to the input module (202), wherein the captured images of the rooms act as an input image which includes at least a portion of the walls of the rooms. The receiving unit (208) is configured to receive the input image from the input module (202) through the communication network unit (206), wherein the communication network unit (206) is further configured to transmit inputs from the input module (202) to the receiving unit (208) for constructing the outline structure and design of different objects on the walls of the rooms. The pre-processing unit (210) is connected to the receiving unit (208) and configured to scan the received input images of the rooms and remove unwanted frames from them, wherein the pre-processing unit (210) is connected to a feature extraction unit (212) configured to extract frames of the wall objects from the scanned images. The processing unit (214) is connected to the feature extraction unit (212) to map the extracted frames of the objects for calculating the area of the extracted frames and highlighting the extracted frames of the wall objects, for hiding and visualizing each extracted object frame with and without the extracted frames of the actual wall, wherein the processing unit (214) selects at least one design option for the wall of the room, selected from a paint, a wallpaper, a texture paint, PVC panelling and a combination thereof, and calculates the area of the wall to make a design on the wall. The controller (216) is connected to the processing unit (214) to provide the wall area calculation based on design simulations as per the visualization preferences set up.
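For orientation only, here is a minimal software sketch of the Figure 1 unit chain (202, 208, 210, 212, 214, 216); all class and function names are hypothetical, since the specification does not prescribe an implementation language or library:

```python
from dataclasses import dataclass, field

@dataclass
class ObjectFrame:
    """Outline trace of one wall object, as produced by unit 212."""
    label: str                 # e.g. "door", "window", "tv", "clock"
    width_ft: float
    height_ft: float
    preference: str = "show"   # user's choice: "show" | "hide" | "remove"

@dataclass
class WallScan:
    """One scanned wall together with its extracted object frames."""
    width_ft: float
    height_ft: float
    frames: list = field(default_factory=list)

def receive(images):
    """Receiving unit (208): accept the captured room images."""
    return list(images)

def preprocess(images):
    """Pre-processing unit (210): discard unwanted frames (blanks here)."""
    return [img for img in images if img is not None]

def extract_object_frames(images):
    """Feature extraction unit (212), stubbed: a real system would
    detect doors, windows, TVs, etc. and trace their outlines."""
    return WallScan(width_ft=12.0, height_ft=10.0, frames=[])
```

The processing-unit and controller logic (214, 216) that consumes these frames is sketched after the embodiment stating the area rule below.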
In an embodiment, the processing unit (214) is configured to calculate net design area of the wall by removing the wall area where objects are to be placed.
In an embodiment, the objects are selected from but not limited to window, door, TV, clock, frames and rack.
In an embodiment, the controller (216) is configured to provide certain dimensions of the extracted frame of the object, if the object is a window, door, TV, wall clock or fixed wall-unit object.
In an embodiment, the controller (216) is configured to illustrate different frame structures of the room on a device of a user and a contractor by visualizing the different design structure.
In an embodiment, the device of the user and contractor is selected from, but not limited to, mobile phone, watch, neck band, shoulder band etc.
In another embodiment, the processing unit (214) is configured to map the extracted frame of each object, providing options to the user to either "show" or "hide" that particular object, like a TV, or to "remove" a frame, like a window or door, so that paint is not applied on that portion and so that each part is, or is not, subtracted from the paintable wall area calculation accordingly.
In another embodiment, the controller (216) is configured to provide options for those frames as to which are to be shown, hidden or removed while visualizing, wherein the controller (216) is further configured to select each frame and act as per the user's instructions/preferences.
In another embodiment, the processing unit (214) is configured to provide a rendered view of the wall with the selected paint or texture as per the user's selections on the various object frames of the scanned wall, wherein the processing unit (214) is further configured to provide the dimensions and area calculation of the paintable surface only, subtracting the door and window areas from it while not subtracting the TV area, whether the TV is set to "show" or "hide" during visualization, since the wall behind the TV needs to be painted. If a wooden TV-unit structure (over which a TV is mounted) is permanently fixed on the wall, the user may select this object frame as "remove" instead of "show", so that this object is shown over the painted wall in the final design visualization while its frame area is subtracted from the overall wall area, as in the case of a door or window. The logic is that a painter cannot actually paint the wall area beneath this object, so the user would want it subtracted from the paintable area.
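Continuing the hypothetical types from the earlier sketch, this is how the area rule just described could look; a sketch of the stated rule, not the specification's actual code:

```python
def paintable_area_sqft(scan):
    """Rule stated above: subtract only frames marked "remove"
    (door, window, fixed wall units); frames marked "show" or
    "hide" affect the visual but not the area, since the wall
    behind them still gets painted."""
    gross = scan.width_ft * scan.height_ft
    removed = sum(f.width_ft * f.height_ft
                  for f in scan.frames if f.preference == "remove")
    return gross - removed

# 12x10 ft wall, a 3x7 ft door marked "remove", a TV marked "hide":
wall = WallScan(12.0, 10.0, [ObjectFrame("door", 3.0, 7.0, "remove"),
                             ObjectFrame("tv", 4.0, 2.5, "hide")])
print(paintable_area_sqft(wall))  # 120 - 21 = 99.0 sq ft
```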
In another embodiment, the processing unit (214) is further configured to calculate the paintable surface area of the ceiling as well, where the user can "hide", "show" or "remove" various objects such as fans, lights, chandeliers etc.
In another embodiment, the processing unit (214) is further configured to provide inputs to the controller (216), which captures inputs such as the various dimensions of the frames as well as the visualization preferences set up, in order to analyse which objects are to be "shown", "hidden" or "removed", and hence provide an illustration (visualization) accordingly.

Figure 2 illustrates a flow diagram of a method (100) of determining area of a wall with respect to its visualization preferences, said method (100) comprising:
Step (102) discloses capturing, by an input module (202), a plurality of images of rooms, wherein the image capturing device (204) is connected with a communication network unit (206) configured to transmit the captured images of the rooms to the input module (202), wherein the captured images of the rooms act as an input image which includes at least a portion of the walls of the rooms;
Step (104) discloses receiving, by a receiving unit (208), the captured images of the rooms, wherein the communication network unit (206) is configured to transmit inputs from the input module (202) to the receiving unit (208) for constructing the outline structure and design of different objects on the walls of the rooms;
Step (106) discloses scanning, by a pre-processing unit (210), the received input images of the rooms and removing unwanted frames from them, wherein the pre-processing unit (210) is connected to a feature extraction unit (212) configured to extract frames of the wall objects from the scanned images;
Step (108) discloses mapping, by a processing unit (214), the extracted frames of the objects for calculating the area of the extracted frames and highlighting the extracted frames of the wall objects, for hiding and visualizing each extracted object frame with and without the extracted frames of the actual wall, wherein the processing unit (214) selects at least one design option for the wall of the room, selected from a paint, a wallpaper, a texture paint, PVC panelling and a combination thereof, and calculates the area of the wall to make a design on the wall; and
Step (110) discloses providing, by a controller (216), the wall area calculation based on design simulations as per the visualization preferences set up.
In an embodiment, the method includes calculating, by the processing unit (214), the net design area of the wall by removing the wall area where objects are to be placed.
In an embodiment, the method includes providing, by the processing unit (214), certain dimensions of the extracted frame of the object, if the object is a window, door, TV, wall clock or fixed wall-unit object.

In an embodiment, the method includes illustrating different frame structures of the room on the device of the user and the contractor by visualizing the different design structures.
In an embodiment, the method (100) determines the area of a wall with respect to its visualization preferences in accordance with an embodiment of the present invention. The inventive steps in the method (100) are highlighted in Steps (108) and (110), namely the processing unit and the controller, where the user is able to set up visualization preferences as per his need in order to finalize and create a project; based on these, the system determines the dimensions and area calculation of the paintable surface along with a supporting visual (here the final visual means that the frame layout illustrating unit provides illustrated drawings and a specification sheet) for reference.
Figure 3 illustrates a 3D view of an actual space with all general possible objects captured in the scene which need to be captured by the device's camera, in accordance with an exemplary embodiment of the present invention. The space can be a single wall or ceiling, or multiple walls and ceilings, of a residential or commercial space. Every user may have a different set of objects in such a scene being captured by the image capturing device (for example, a chair, table, TV, wall racks, clock, table fan, table-mounted or wall-mounted light, tube light, bulb, painting, bag hanging on a wall hook, cot etc., where these objects might be on the wall or somewhere between the wall and the camera).
Figure 4 illustrates a scanned version of the space showing major outline traces (or frames) of various objects so that the user can set preferences prior to visualization, in accordance with an exemplary embodiment of the present invention.
In an embodiment, the frame structures need not be square or rectangular outline structures but can also be irregular-shaped outline traces of specific objects, for better visualization quality. For example, in the scan of a table lamp lying on the table in front of the wall, the lamp's outline trace should be an irregular trace matching its actual shape as observed by the camera, for superior visualization quality when placing the user's choice of paint or texture on the wall (because this object might need to be "shown", not "hidden", which means it should be clearly defined, overlapping the applied paint, for a clear visual).
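Since an irregular trace is just a closed polygon of vertices observed by the camera, its enclosed area can be computed directly; the shoelace formula below is a standard way to do this, offered as an illustrative assumption rather than a method named in the specification:

```python
def polygon_area(vertices):
    """Shoelace formula: area enclosed by an irregular outline trace,
    given its (x, y) vertices in drawing order (units, e.g. feet)."""
    total = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]   # wrap around to close the trace
        total += x1 * y2 - x2 * y1
    return abs(total) / 2.0

# Hypothetical L-shaped trace, e.g. around a lamp silhouette:
print(polygon_area([(0, 0), (2, 0), (2, 1), (1, 1), (1, 3), (0, 3)]))  # 4.0
```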
Figure 5 illustrates a view of the user setting preferences by identifying which objects are to be shown or hidden during visualization (like a clock or TV) as well as which are to be removed (like a door and window), in accordance with an exemplary embodiment of the present invention. Being one of the most crucial inventive steps, this option enables a user to visualize his space whether it is an empty new house with bare white walls of which he has just taken possession, or an existing house where he lives with a cot, table, wall clock etc. on or around an already coloured wall. If the user is planning to shift the clock or table from its current position for a new look and transformation of the wall, he may set the visualization preference to "hide" for the clock and table, so that when he visualizes the wall with the new paint or texture in the app, the clock and table disappear from the final visual. Similarly, the user may choose to retain a TV on the existing wall while visualizing in the app, setting the visualization preference to "show". In both cases, "show" or "hide", those areas are calculated and counted as paintable surface area by the app.
Figure 6, illustrates visual of the specific wall filled with user’s choice of colour or texture (as per visualization preferences set-up) in accordance with an exemplary embodiment of the present invention.
Now one can notice from this figure that the object frames selected as "show" appear in the final visual along with the newly applied colour or texture on the wall, whereas the objects marked as "hide" have disappeared from the final visual. This gives the level of satisfaction a user requires in order to finalize designs, create the project, and get the area and cost calculation to find a vendor online.

Figure 7 illustrates a visual of the specific wall highlighting only the paintable portion (ignoring other objects which were set as "hidden" or "shown" in the visualization preferences), in accordance with an exemplary embodiment of the present invention. One can clearly see from Figures 6 and 7 that, based on the visualization preferences, the app provides final visuals that satisfy the user and serve his need, while the actual paintable area is shown separately for the reference of the user and vendor for area and cost calculation (this visual is required while executing the project at site, to avoid any miscommunication). If the user and vendor referred only to the visual of Figure 6, it would create confusion as to whether the area behind objects marked "show" is within the scope of the paintable-area work or not, whereas with Figure 7 both parties clearly understand which portion of the wall is to be painted. The visual of Figure 7 alone, however, is not sufficient for the user to finalize designs; hence both types of visuals, Figures 6 and 7, are needed, serving different purposes.

Figure 8, shows a block diagram showing a hardware structure of the image processing apparatus in accordance with an exemplary embodiment of the present invention.
The image processing apparatus 702 (the processing unit 214 as referred to in Figure 2) is implemented by an MFP (multifunction peripheral) that is capable of performing various types of processing including, for example, a scanning operation of scanning an original document into scanned image data; a copying operation of printing the scanned image data onto a recording sheet to output a printed sheet; a distribution operation of distributing image data through a network such as a local area network; and a facsimile transmission operation of transmitting image data via facsimile.

The image processing apparatus 702 of Figure 8 mainly includes a controller 704, an operation panel 720, a converter 722, a facsimile control unit (FCU) 724, and an engine 726.

The controller 704 controls entire processing performed by the image processing apparatus 702. The controller 704 includes a central processing unit (CPU) 706, an application specific integrated circuit (ASIC) 708, an image data buffer 710, a secondary storage 712, a read only memory (ROM) 714, and a network interface (I/F) 716.

The CPU 706 is a processor that controls processing performed by the image processing apparatus 702. For example, the CPU 706 deploys a control program such as the image processing program stored in the ROM 714 onto the RAM to perform various processing including the image processing operation that will be described below.

The ASIC 708 is an integrated circuit that performs scanning of the original document, distribution of image data, and printing of image data, for example. When the ASIC 708 receives a processing request that requests execution of various processing, which is instructed by a user through the operation panel 720, the ASIC 708 causes any one of the converter 722, FCU 724, and engine 726 to perform the requested processing, by transmitting a control signal through a peripheral component interconnect (PCI) bus 718. In this example, the ASIC 708 is assumed to perform various processing through at least one of the converter 722, FCU 724, and engine 726. Alternatively, the ASIC 708 may deploy specific programs onto the RAM to execute various processing according to the processing request.

The image data buffer 710 is a data buffer that temporarily stores image data to be processed. The image data buffer 710 may be implemented by a volatile memory such as RAM, for example. In this example, the image data that is generated by scanning the original document, which may be referred to as the scanned image data, is stored in the secondary storage 712 through the image data buffer 710. The scanned image data, which is read out from the secondary storage 712, is converted so as to have a specific format to generate converted image data. The converted image data, which is generated by the converter 722, is buffered into the image data buffer 710 and stored in the secondary storage 712, on a first-in first-out (FIFO) basis. The ASIC 708 performs various processing using the converted image data such as printing, facsimile transmission, or network distribution.
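The buffer-then-store flow just described (data arrives in a RAM buffer, then moves to secondary storage in first-in first-out order) can be pictured with a toy stand-in for buffer 710; everything here is an illustrative assumption:

```python
from collections import deque

class ImageDataBuffer:
    """Toy stand-in for the image data buffer 710: holds image data
    in memory and releases it to secondary storage in FIFO order."""
    def __init__(self):
        self._queue = deque()

    def buffer(self, image_data):
        self._queue.append(image_data)             # from scan or convert

    def flush_to_storage(self, storage):
        while self._queue:
            storage.append(self._queue.popleft())  # FIFO order preserved

secondary_storage = []            # stand-in for the HDD/flash of 712
buf = ImageDataBuffer()
buf.buffer(b"page-1")
buf.buffer(b"page-2")
buf.flush_to_storage(secondary_storage)
print(secondary_storage)          # [b'page-1', b'page-2'], arrival order
```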

The secondary storage 712 is a storage device that stores the scanned image data and the converted image data. The secondary storage 712 may be alternatively referred to as a supplementary storage. When storing the scanned image data or the converted image data, the ASIC 708 generates identification information ("image data identification information") that is unique to the image data being stored, in the form of metadata that is associated with the image data being stored. The ASIC 708 uses this image data identification information to obtain, from the secondary storage 712, the scanned image data or the converted image data that matches the process requested by the processing request. In this example, the secondary storage 712 may be implemented by a nonvolatile memory such as an HDD or flash memory.
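The identification scheme above amounts to attaching a unique key as metadata when image data is stored and using that key for retrieval; a minimal sketch follows, where hashing the content plus a timestamp is an assumed way to generate the key (the specification does not say how the ID is produced):

```python
import hashlib
import time

def store_with_id(storage, image_data):
    """Store image data under identification information unique to it,
    kept as metadata alongside the data, and return the key."""
    image_id = hashlib.sha256(
        image_data + str(time.time_ns()).encode()).hexdigest()[:16]
    storage[image_id] = {"data": image_data,
                         "stored_ns": time.time_ns()}   # metadata
    return image_id

store = {}
key = store_with_id(store, b"scanned-page")
assert store[key]["data"] == b"scanned-page"   # retrieval by the ID
```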

The network I/F 716 functions as an interface with an outside network such as a LAN or the Internet. The network I/F 716 transmits the converted image data to the outside network according to an instruction received from the ASIC 708. The network I/F 716 may receive various data such as data to be printed from the outside network, for example, from an information processing apparatus that generates the data to be printed.

The operation panel 720 allows the user to select, from various processing that can be provided by the image processing apparatus 702, one or more processing to cause the image processing apparatus 702 to perform the selected processing. The operation panel 720 displays thereon processing that can be provided by the image processing apparatus 702. Upon selection of specific processing and settings information that further specifies how the selected processing is to be performed by the user, one or more processing requests each requesting execution of selected processing are transmitted to the ASIC 708 together with information regarding a type of the selected processing and settings information. In this example, the operation panel 720 is implemented by a touch panel screen.

More specifically, the operation panel 720 displays thereon a plurality of buttons or keys, which may be collectively referred to as keys. The keys include a key for instructing copying of the original document, a key for instructing facsimile transmission of image data of the original document, a key for instructing network distribution of image data of the original document, etc. When the key for copying is selected, the user is able to select or specify a number of pages to be copied, a color of the printed image being output, a recording sheet size, image quality of the printed image, encoding format of image data, enlarged size ratio, reduction size ratio, etc., as settings information for copying. When the key for facsimile transmission is selected, the user is able to select or specify a telephone number to which facsimile data is sent, image quality of facsimile data, encoding format of facsimile data, re-transmission option indicating whether to re-transmit in case of error, etc., as settings information for facsimile transmission. When the key for network distribution is selected, the user is able to select or specify a file path or an email address that identifies a destination to which image data is sent, encoding format of image data, etc., as settings information for network distribution.
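The settings information described above is essentially a per-operation key/value payload attached to each processing request; the sketch below shows one plausible shape for such requests, with every field name being hypothetical:

```python
# Hypothetical processing-request payloads as the panel might
# assemble them; field names are illustrative, not from the text.
copy_request = {
    "type": "copy",
    "settings": {"pages": 2, "color": "mono", "sheet_size": "A4",
                 "quality": "high", "encoding": "jpeg", "scale": 1.0},
}
fax_request = {
    "type": "fax",
    "settings": {"number": "0000000000", "quality": "fine",
                 "retransmit_on_error": True},
}
distribution_request = {
    "type": "distribute",
    "settings": {"destination": "file://scans/out", "encoding": "pdf"},
}
```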

The operation panel 720 further requests the user to select whether to store image data being processed, when a plurality of processing requests are to be sequentially performed. In one example, the operation panel 720 may provide a check box that can be selected or unselected by the user. In another example, the operation panel 720 may be caused to display a message that asks the user whether to store image data, only when more than one processing requests are instructed.
In alternative to instructing processing to be performed through the operation panel 720, the user may instruct the image processing apparatus 702 to perform specific processing by sending one or more processing requests through the information processing apparatus via the network. In such case, the processing request received at the network I/F 716 is sent to the ASIC 708 for further processing.

For descriptive purposes, in this example, the operation panel 720 and the network interface 716 may be collectively referred to as a user interface that provides the function of receiving one or more processing requests from the user.

The converter 722 converts a data format of the scanned image data of the original document to generate converted image data having a data format that matches the requested processing, according to an instruction received from the ASIC 708. In this example, the converter 722 is implemented by an ASIC that is specially designed for data conversion. Alternatively, the functions of the converter 722 may be at least partially performed by a data conversion program. In the example illustrated in Figure 8, the converter 722 is provided independently from the controller 704. Alternatively, the controller 704 may incorporate the converter 722 therein.

The FCU 724 transmits image data of the original document via facsimile. The FCU 724 transmits the converted image data that is generated by the converter 722 through a telephone network, according to an instruction received from the ASIC 708.

The engine 726 provides functions of scanning the original document or printing image data of the original document. The engine 726 includes an ASIC 728, scanner 730, and printer 732.

The ASIC 728 is an integrated circuit that performs various processing such as scanning of the original document or printing of image data of the original document. For example, according to an instruction received from the controller 704, the ASIC 728 causes the scanner 730 to scan the original document into scanned image data or causes the printer 732 to print the scanned image data on a recording sheet. Alternatively, any of the functions provided by the ASIC 728 may be performed by a control program that is deployed onto the RAM.

The scanner 730 scans the original document into scanned image data. The scanner 730, which is implemented by any desired scanner, includes an optical system and a charge-coupled device (CCD) sensor, for example. According to an instruction for scanning that is received from the ASIC 728, the scanner 730 directs light toward the original document surface to form an optical image on the CCD sensor. The CCD sensor converts the optical image formed thereon to an electrical signal. The scanner 730 further applies image processing such as analog-to-digital conversion to the electrical signal to generate the scanned image data. The scanned image data may further undergo correction processing such as gamma correction. The scanner 730 outputs the scanned image data to the controller 704 through the PCI bus 718.
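The gamma correction mentioned above maps raw scanned intensities through a power curve so that dark regions are lifted perceptually; a minimal sketch, with NumPy as an assumed choice of tool:

```python
import numpy as np

def gamma_correct(raw, gamma=2.2):
    """Map 8-bit scanned intensities through the power curve
    out = in ** (1 / gamma), the usual form of gamma correction."""
    normalized = raw.astype(np.float64) / 255.0
    corrected = np.power(normalized, 1.0 / gamma)
    return np.clip(corrected * 255.0, 0, 255).astype(np.uint8)

scanline = np.array([0, 64, 128, 255], dtype=np.uint8)
print(gamma_correct(scanline))   # dark values lifted; 0 and 255 fixed
```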

The printer 732 prints image data of the original document, which is obtained by converting the scanned image data, onto a recording sheet. According to an instruction received from the ASIC 728, the printer 732 prints the image data onto the recording sheet based on settings information included in the processing request input by the user through the operation panel 720. In case of copying, the printer 732 obtains the scanned image data generated by the scanner 730 and stored in the secondary storage 712, and prints the scanned image data onto the recording sheet. In case of printing, the printer 732 obtains image data to be printed from the network through network I/F 716, and prints the image data onto the recording sheet.

The drawings and the foregoing description give examples of embodiments. Those skilled in the art will appreciate that one or more of the described elements may well be combined into a single functional element. Alternatively, certain elements may be split into multiple functional elements. Elements from one embodiment may be added to another embodiment. For example, orders of processes described herein may be changed and are not limited to the manner described herein. Moreover, the actions of any flow diagram need not be implemented in the order shown; nor do all of the acts necessarily need to be performed. Also, those acts that are not dependent on other acts may be performed in parallel with the other acts. The scope of embodiments is by no means limited by these specific examples. Numerous variations, whether explicitly given in the specification or not, such as differences in structure, dimension, and use of material, are possible. The scope of embodiments is at least as broad as given by the following claims.
Benefits, other advantages, and solutions to problems have been described above with regard to specific embodiments. However, the benefits, advantages, solutions to problems, and any component(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature or component of any or all the claims.

Claims:
1. A system (200) of determining area of a wall with respect to its visualization preferences, said system (200) comprising:
an input module (202) connected with an image capturing device (204) configured to capture a plurality of images of rooms, wherein the image capturing device (204) is connected with a communication network unit (206) configured to transmit the captured images of the rooms to the input module (202), wherein the captured images of the rooms act as an input image which include at least a portion of walls of the rooms;
a receiving unit (208) configured to receive the input image from the input module (202) through the communication network unit (206), wherein the communication network unit (206) is further configured to transmit inputs from the input module (202) to the receiving unit (208) for constructing outline structure and design of different objects for walls of the rooms;
a pre-processing unit (210) connected to the receiving unit (208) configured to scan the received input images of the rooms and remove unwanted frames from the received input images of the rooms, wherein the pre-processing unit (210) is connected to a feature extraction unit (212) configured to extract frames of the objects of walls from the scanned images;
a processing unit (214) connected to the feature extraction unit (212) to map the extracted frames of the objects for calculating an area of the extracted frames and highlighting extracted frames of the objects of the wall for hiding and visualizing each extracted frame of the object of wall of the room with and without extracted frames of the actual wall, wherein the processing unit (214) selects at least one design option for the wall of the room, selected from a paint, a wallpaper, a texture paint, PVC panelling and a combination thereof, and calculates the area of the wall to make a design on the wall; and
a controller (216) connected to the processing unit (214) to provide wall area calculation based on design simulations as per visualization preferences set up.
2. The system as claimed in claim 1, wherein the processing unit (214) is configured to calculate net design area of the wall by removing the wall area where objects are to be placed.
3. The system as claimed in claim 2, wherein the objects are selected from but not limited to window, door, TV, clock, frames and rack.
4. The system as claimed in claim 1, wherein the controller (216) is configured to provide certain dimensions of the extracted frame of the object, if the object is a window, door, TV, wall clock or fixed-wall-unit object.
5. The system as claimed in claim 1, wherein the controller (216) is configured to illustrate different frame structures of the room on a device of a user and a contractor by visualizing the different design structure.
6. The system as claimed in claim 5, wherein the device of the user and contractor is selected from, but not limited to, mobile phone, watch, neck band, shoulder band etc.
7. A method (100) of determining area of a wall with respect to its visualization preferences, said method (100) comprising:
capturing, by an input module (202), a plurality of images of rooms, wherein the image capturing device (204) is connected with a communication network unit (206) configured to transmit the captured images of the rooms to the input module (202), wherein the captured images of the rooms act as an input image which include at least a portion of walls of the rooms;
receiving, by a receiving unit (208), the captured images of the room, wherein the communication network unit (206) is configured to transmit inputs from the input module (202) to the receiving unit (208) for constructing outline structure and design of different objects for walls of the rooms;
scanning, by a pre-processing unit (210), the received input images of the rooms and removing unwanted frames from the received input images of the rooms, wherein the pre-processing unit (210) is connected to a feature extraction unit (212) configured to extract frames of the objects of walls from the scanned images;
mapping, by a processing unit (214), the extracted frames of the objects for calculating an area of the extracted frames and highlighting extracted frames of the objects of the wall for hiding and visualizing each extracted frame of the object of wall of the room with and without extracted frames of the actual wall, wherein the processing unit (214) selects at least one design option for the wall of the room, selected from a paint, a wallpaper, a texture paint, PVC panelling and a combination thereof, and calculates the area of the wall to make a design on the wall; and
providing, by a controller (216), wall area calculation based on design simulations as per visualization preferences set up.
8. The method as claimed in claim 7, wherein the net design area of the wall is calculated, by the processing unit (214), by removing the area of those object frames for which the visualization preference was set to "remove".

9. The method as claimed in claim 7, wherein certain dimensions of the extracted frame of the object are provided by the processing unit (214), if the object is a window, door, TV, wall clock or fixed-wall-unit object.
10. The method as claimed in claim 7, wherein different frame structures of the room are illustrated on the device of the user and contractor by visualizing the different design structure.

Documents

Orders

Section Controller Decision Date
15 and 43(1) Pritish Ranjan Pradhan 2024-05-27
15 and 43(1) Pritish Ranjan Pradhan 2024-07-11
15 and 43(1) Pritish Ranjan Pradhan 2024-07-12

Application Documents

# Name Date
1 202341086088-STATEMENT OF UNDERTAKING (FORM 3) [16-12-2023(online)].pdf 2023-12-16
2 202341086088-FORM FOR STARTUP [16-12-2023(online)].pdf 2023-12-16
3 202341086088-FORM FOR SMALL ENTITY(FORM-28) [16-12-2023(online)].pdf 2023-12-16
4 202341086088-FORM 1 [16-12-2023(online)].pdf 2023-12-16
5 202341086088-FIGURE OF ABSTRACT [16-12-2023(online)].pdf 2023-12-16
6 202341086088-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [16-12-2023(online)].pdf 2023-12-16
7 202341086088-EVIDENCE FOR REGISTRATION UNDER SSI [16-12-2023(online)].pdf 2023-12-16
8 202341086088-DRAWINGS [16-12-2023(online)].pdf 2023-12-16
9 202341086088-DECLARATION OF INVENTORSHIP (FORM 5) [16-12-2023(online)].pdf 2023-12-16
10 202341086088-COMPLETE SPECIFICATION [16-12-2023(online)].pdf 2023-12-16
11 202341086088-STARTUP [18-12-2023(online)].pdf 2023-12-18
12 202341086088-Proof of Right [18-12-2023(online)].pdf 2023-12-18
13 202341086088-FORM28 [18-12-2023(online)].pdf 2023-12-18
14 202341086088-FORM-9 [18-12-2023(online)].pdf 2023-12-18
15 202341086088-FORM-26 [18-12-2023(online)].pdf 2023-12-18
16 202341086088-FORM 18A [18-12-2023(online)].pdf 2023-12-18
17 202341086088-FER.pdf 2024-02-07
18 202341086088-OTHERS [28-03-2024(online)].pdf 2024-03-28
19 202341086088-FER_SER_REPLY [28-03-2024(online)].pdf 2024-03-28
20 202341086088-DRAWING [28-03-2024(online)].pdf 2024-03-28
21 202341086088-CLAIMS [28-03-2024(online)].pdf 2024-03-28
22 202341086088-US(14)-HearingNotice-(HearingDate-02-05-2024).pdf 2024-04-02
23 202341086088-Correspondence to notify the Controller [25-04-2024(online)].pdf 2024-04-25
24 202341086088-FORM-26 [27-04-2024(online)].pdf 2024-04-27
25 202341086088-Written submissions and relevant documents [17-05-2024(online)].pdf 2024-05-17
26 202341086088-PatentCertificate12-07-2024.pdf 2024-07-12
27 202341086088-IntimationOfGrant12-07-2024.pdf 2024-07-12

Search Strategy

1 SearchHistory(57)E_23-01-2024.pdf
