Abstract: A designing assistive device for buildings, comprising a cuboidal body 101 positioned on the ground surface of an under-construction building; an inbuilt user interface for wirelessly inputting commands to create a three-dimensional map of the building; a computing unit that processes these commands; a microcontroller, wirelessly linked to the computing unit, that determines the real-time location of the device using GPS and navigates the body 101 to a user-defined location using motorized tracked wheels 102; an AI-based imaging unit 103 mounted on a robotic arm 104 to capture images and create the 3D map displayed on the computing unit; a holographic projection unit 105 that visualizes the design in 3D; a microphone 106 that records noise levels to suggest interior spaces; a pneumatic-powered gripper 108 that places markers in designated sections; and an ultrasonic sensor 109 that checks wall thickness against predefined standards and alerts the user if the thickness is inadequate.
Description: FIELD OF THE INVENTION
[0001] The present invention relates to a designing assistive device for buildings that enables users to create an accurate, interactive 3D map of a building under construction and allows the visualization and modification of different areas for interior spaces.
BACKGROUND OF THE INVENTION
[0002] Designing a building is a critical process that combines creativity, functionality, and sustainability to create spaces that meet the needs of their occupants while contributing to the built environment. Effective building design starts with understanding the purpose and function of the space, whether it is residential, commercial, or industrial. Architects and designers carefully consider factors such as the building's layout, accessibility, safety, and aesthetic appeal. Incorporating natural light, energy efficiency, and sustainable materials is crucial in minimizing environmental impact and reducing long-term operational costs. The design process also involves compliance with local regulations, zoning laws, and building codes, ensuring that the structure is safe and suitable for its intended use. A well-designed building fosters a sense of well-being for its inhabitants, promoting comfort, productivity, and a connection to the surrounding community. As cities grow and urbanization continues, thoughtful building design aids in shaping the future of living and working spaces, balancing innovation with tradition.
[0003] Traditional methods of building design often rely on manual drafting, physical models, and a limited use of computer-aided design (CAD) tools. Architects typically worked with hand-drawn blueprints, relying on their expertise and experience to create layouts, structural elements, and material selections suited to the intended purpose. While this method allowed for a personalized, hands-on approach, it also had significant drawbacks. One of the primary limitations is the time-consuming nature of hand-drawn plans, which leads to inefficiencies and errors. Traditional methods lacked the precision and flexibility of modern digital tools, resulting in inaccuracies that do not become apparent until later stages of construction. The reliance on physical models also makes it difficult to modify designs quickly, resulting in longer project timelines and higher costs. Traditional methods do not prioritize sustainability or energy efficiency, as these concepts were not emphasized before the rise of modern design technologies. The lack of real-time collaboration between teams leads to miscommunication and delays, further impacting overall efficiency.
[0004] US20030023412A1 discloses a method and system for creating, using, and managing a three-dimensional digital model of the physical environment that combines outdoor terrain elevation and land-use information, building placements, heights, and geometries of the interior structure of buildings, along with site-specific models of components that are distributed spatially within a physical environment. The cited invention separately provides an asset management system that allows the integrated three-dimensional model of the outdoor, indoor, and distributed infrastructure equipment to communicate with and aggregate the information pertaining to actual physical components of the actual network, thereby providing a management system that can track the ongoing performance, cost, maintenance history, and depreciation of multiple networks using the site-specific unified digital format.
[0005] US2016300293A1 discloses methods, systems, and devices for designing a commercial or residential space via a design application. For example, in some embodiments, a method is disclosed which enables a user to input information including, for example, photos or video of the space, lighting, color(s), user preferences, measurements, and the configuration and/or location of openings in the space. In such embodiments, the user can select a design theme, style, or designer, and based on the information input (or acquired), the method presents recommendations of a new design for the space, which may include recommendations of products to furnish the space. Further embodiments also include enabling the user to purchase such products, and may also allow the user to hire service personnel to construct the recommended design and/or install selected/purchased products.
[0006] Conventionally, many methods are available for designing a building. However, the cited inventions lack the ability to integrate real-time data collection and interaction during the construction process. While these conventional methods provide a framework for initial design, they do not facilitate continuous adjustments or on-site modifications based on real-world conditions. They also lack autonomous navigation and location tracking features that would allow for more efficient on-site design implementation, which is crucial for creating a comprehensive, user-responsive design process.
[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a device that is capable of enabling users to not only visualize and plan a building's design in a comprehensive and interactive manner but also provide real-time, on-site support for construction adjustments. The developed device should enable the continuous monitoring and adjustment of building elements based on real-time data, thus offering an immersive experience to preview the final result, thereby enhancing the building process with higher accuracy, efficiency, and responsiveness to the evolving construction environment.
OBJECTS OF THE INVENTION
[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.
[0009] An object of the present invention is to develop a device that is capable of enabling users to generate an accurate three-dimensional map of a building under construction based on input commands provided by the user.
[0010] Another object of the present invention is to develop a device that is capable of facilitating real-time location tracking of the device within the building site for allowing autonomous navigation to reach specified locations based on user-defined coordinates.
[0011] Another object of the present invention is to develop a device that is capable of capturing detailed images of construction areas for the purpose of preparing a comprehensive 3D map, which aids users in visualizing and marking areas for different interior spaces.
[0012] Another object of the present invention is to develop a device that is capable of providing an immersive visualization of the construction design which helps users understand how the area would look once completed.
[0013] Another object of the present invention is to develop a device that is capable of monitoring and analyzing surrounding noise levels in different sections of the building and suggesting appropriate interior spaces based on acoustic requirements.
[0014] Yet another object of the present invention is to develop a device that is capable of automating the placement of markers at designated areas within the building site for ensuring that construction teams easily follow the user's design instructions.
[0015] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.
SUMMARY OF THE INVENTION
[0016] The present invention relates to a designing assistive device for buildings that is capable of autonomously navigating within a construction site by incorporating real-time location tracking and precise route determination based on user-defined coordinates, ensuring the device reaches specified locations seamlessly and prepares a 3D map that enables the user to understand how the selected area would look upon construction.
[0017] According to an embodiment of the present invention, a designing assistive device for buildings comprises a cuboidal body that is positioned on the ground surface of the construction site. The body is wirelessly connected to a computing unit via a user interface, thus enabling the user to input commands for generating the 3D map and specifying the location to be designed. A microcontroller, linked to the computing unit through communication modules such as Wi-Fi (Wireless Fidelity), Bluetooth, or GSM (Global System for Mobile Communication), processes these inputs and activates a GPS module for real-time location tracking of the device. The microcontroller determines the optimal route to the user-defined location and actuates motorized tracked wheels to navigate the device accordingly. An AI-based imaging unit mounted on a robotic arm captures images of the area to create the 3D map displayed on the computing unit. A holographic projection unit allows the user to visualize the design in 3D, and a microphone records environmental noise levels to help the microcontroller suggest appropriate interior spaces for different areas. The device is equipped with a chamber containing plastic marking units, which are placed at user-defined locations by a telescopic gripper powered by a pneumatic unit. An ultrasonic sensor works with the imaging unit to measure wall thickness and alerts the user if the thickness is below the required standard. The device is powered by a battery to operate its electrical and electronic components.
[0018] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS
[0019] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of a designing assistive device for buildings.
DETAILED DESCRIPTION OF THE INVENTION
[0020] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood, that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.
[0021] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.
[0022] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.
[0023] The present invention relates to a designing assistive device for buildings that facilitates comprehensive construction planning by capturing detailed images of the building areas, projecting immersive holograms of the 3D map, and automating the placement of markers while analyzing environmental factors such as noise levels to suggest appropriate construction modifications.
[0024] Referring to Figure 1, an isometric view of a designing assistive device for buildings is illustrated, comprising a cuboidal body 101 positioned on a ground surface, motorized tracked wheels 102 arranged underneath the body 101, an artificial intelligence-based imaging unit 103 installed on the body 101 via a robotic arm 104, a holographic projection unit 105 installed on the body 101, a microphone 106 positioned on the body 101, a chamber 107, storing multiple plastic marking units, installed on the body 101, a telescopically operated gripper 108 arranged with the body 101, and an ultrasonic sensor 109 arranged with the body 101.
[0025] The device disclosed herein includes a cuboidal body 101 which is placed on a ground surface of an under-construction building having only side walls. The body 101 serves as the physical housing for all the components that work together to assist in the design and mapping of the building's layout. The cuboidal shape is chosen to provide stability and ample space for the integration of various elements. The body is developed to be compact yet robust, capable of enduring the rigors of a construction site while ensuring that the device moves smoothly and interacts with the construction environment.
[0026] A user interface inbuilt in a computing unit allows users, such as architects, engineers, or project managers, to input commands that guide the device in mapping the building's layout. The computing unit processes the inputs from the user and controls the overall functioning of the device. The computing unit is wirelessly connected to the device, ensuring that the user can input commands from a distance, without the need for physical connections or proximity to the device. The wireless communication between the computing unit and the device ensures that data and commands are transmitted effectively between the user and the device.
[0027] This connection is facilitated through a communication module that includes, but is not limited to, Wi-Fi (Wireless Fidelity), Bluetooth, or GSM (Global System for Mobile Communication), thus allowing the user to interact with the device remotely. The user inputs commands via a graphical user interface (GUI) displayed on the computing unit, where the user specifies a range of actions such as generating a 3D map of the construction site, identifying the areas to be mapped, or defining specific locations within the building to be designed. The device is pre-programmed to interpret these commands and execute them accordingly, creating an interactive and user-friendly experience.
[0028] A microcontroller is wirelessly linked to the computing unit; it processes the input commands received from the user through the computing unit and directs the device to execute specific tasks. The microcontroller acts as the central processing unit of the device, translating user instructions into actionable steps, such as navigating the device to a specific location or initiating scanning functions. The computing unit sends commands to the microcontroller, which interprets these commands and sends back feedback or initiates actions accordingly. This wireless connection is achieved using communication modules that support different wireless technologies, including but not limited to Wi-Fi, Bluetooth, and GSM. Wi-Fi enables the device to communicate over longer distances and is particularly useful when the building's construction site is equipped with a network infrastructure. Bluetooth, on the other hand, allows for short-range communication between the computing unit and the device when the user is nearby. The GSM module provides cellular connectivity for communication over even larger distances, useful for remote monitoring and operation, especially when the construction site is expansive or in areas with limited Wi-Fi availability.
[0029] In addition to these communication modules, a GPS (Global Positioning System) module determines the real-time location of the device within the construction site. The GPS module is installed on the body 101 and continuously tracks the position of the cuboidal body 101 as the body 101 moves across the ground surface. This real-time location detection ensures that the device knows where it is at any given moment within the construction site, which is vital for tasks such as mapping out specific locations, guiding the device to precise user-defined points, or maneuvering the device along a predetermined route. The GPS module allows the device to operate autonomously, as the device calculates its exact coordinates and compares them with the user's input location to ensure it follows the correct path.
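By way of a non-limiting illustration, the comparison of the device's GPS fix against the user-defined target could be sketched as follows. This Python sketch is not part of the specification; the function names and the 0.5 m arrival tolerance are illustrative assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes (haversine formula)."""
    r = 6_371_000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def reached_target(current, target, tolerance_m=0.5):
    """True when the device is within tolerance_m of the user-defined point."""
    return haversine_m(*current, *target) <= tolerance_m
```

In such a scheme the microcontroller would poll the GPS fix periodically and stop the tracked wheels once `reached_target` returns true.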
[0030] Once the location is determined through GPS, the microcontroller processes this data and compares it with the user's input commands for the user-defined location. The device calculates the best possible route to reach that location, determined by analyzing the construction site's layout, any potential obstacles, and the most efficient path to the target destination. This allows the device to navigate the construction site autonomously without requiring constant manual intervention from the user. The route calculation takes into account factors like available space for movement, ensuring that the cuboidal body 101 travels smoothly and safely from one location to another.
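The route calculation around obstacles described above could, for instance, be realized as a shortest-path search over an occupancy grid of the site. The following Python sketch uses breadth-first search; it is a minimal illustration under assumed data structures, not the claimed route-determination method.

```python
from collections import deque

def plan_route(grid, start, goal):
    """Breadth-first search over an occupancy grid (0 = free, 1 = obstacle).
    Returns the shortest list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            # reconstruct the path by walking predecessors back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from:
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None  # no obstacle-free route exists
```

The microcontroller would then translate the returned cell sequence into speed and direction commands for the tracked wheels.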
[0031] To achieve this autonomous navigation, the device is equipped with motorized tracked wheels 102 arranged underneath the cuboidal body 101. These wheels 102 are controlled by the microcontroller and provide the device with the necessary mobility to move along the determined route. The tracks allow the device to handle various terrains commonly found on construction sites, such as uneven surfaces, dirt, or debris. The motorized tracked wheels 102 enable precise movement and ensure that the device follows the pre-determined path with accuracy. These wheels 102 are driven by motors that are directly controlled by the microcontroller, which processes the input commands related to movement and adjusts the wheels' speed and direction as needed.
[0032] The motorized tracked wheels 102 allow for smooth and reliable maneuvering of the cuboidal body 101 across the construction site. The microcontroller receives data from the GPS module to continuously adjust the device's movement and ensure that the body 101 stays on course and ultimately reaches the user-defined location. As the device moves along the determined route, the wheels 102 allow the body 101 to travel with ease even in challenging construction environments. In situations where the terrain is difficult or obstructed, the body 101 adapts and makes necessary adjustments to its path, ensuring the device reaches its destination without being hindered by the site conditions.
[0033] An artificial intelligence-based imaging unit 103 is paired with a processor and mounted on the body 101 via a robotic arm 104. The imaging unit 103 aids in collecting data from the construction site and creating a highly detailed three-dimensional (3D) map of the area under construction while the body 101 maneuvers throughout the building. The imaging unit 103 is equipped with imaging technologies, such as a high-resolution camera or 3D scanner, capable of capturing images of the user-defined area from multiple angles. These images are then processed using AI (artificial intelligence) protocols to stitch them together into a 3D map that accurately represents the physical space of the building.
[0034] The artificial intelligence (AI) in the imaging unit 103 significantly enhances the efficiency and accuracy of the mapping process. AI protocols are employed to automatically analyze the captured images, identify important features within the environment, and correct any distortions or inconsistencies in the data. The AI imaging unit 103 is able to detect and account for obstacles, varying surface textures, and changes in depth, ensuring that the resulting 3D map is both accurate and comprehensive. The AI is also used to identify the boundaries of different sections within the building, such as rooms, walls, or architectural features, helping to create a precise layout of the construction area. The AI thus ensures that the process of creating a 3D map is fast, accurate, and automated, minimizing human error and improving the overall design process.
[0035] The robotic arm 104 aids in positioning the imaging unit 103 to capture data from various perspectives. The arm 104 is highly flexible and capable of moving the imaging unit 103 in multiple directions such as up, down, left, and right, and possibly even rotating to provide a 360-degree view of the area being mapped. This movement is particularly beneficial in large construction sites where a fixed camera is not able to capture all necessary angles. The robotic arm 104, controlled by the microcontroller, ensures that the imaging unit 103 is precisely positioned for each shot, allowing the imaging unit 103 to capture detailed images of the entire user-defined area. This is important because the 3D map requires a complete and comprehensive set of data points, and the robotic arm 104 enables the device to capture every angle necessary for creating a complete digital representation of the space.
[0036] Once the images are captured, the data is processed and converted into a 3D map that is displayed on the device's computing unit. The map represents the building's layout in three dimensions, providing the user with an interactive and immersive view of the construction site. The 3D map is not just a simple static image but is developed to be dynamic, allowing the user to zoom in, rotate, and interact with the model to examine different sections of the building. Through the computing unit, users also mark specific areas in the 3D map where interior spaces, such as a kitchen, bedroom, dining hall, drawing room, or washroom, need to be planned or constructed. This gives the user the ability to actively participate in the design process, make real-time changes to the layout, and visualize how different elements would fit into the overall structure.
[0037] The cuboidal body 101 is equipped with a holographic projection unit 105 which is activated by the microcontroller. Once the 3D map is prepared and displayed on the computing unit, the holographic projection unit 105 creates a hologram of this map, effectively projecting the 3D image of the building's design into physical space. This allows the user to view the design not just on a screen but in real life, providing a much more immersive experience. The hologram allows the user to visualize how the constructed building would look once completed, helping to provide insights into the scale, layout, and aesthetics of the space. By projecting the hologram, the device enables users to explore the construction design from all angles, offering a more intuitive understanding of how different design elements interact with the actual environment.
[0038] A microphone 106 is positioned on the body 101 for capturing auditory data from the construction environment. When activated by the microcontroller, the microphone 106 continuously records the surrounding noise levels in various sections of the building under construction. This is important because the noise levels in a building vary significantly depending on factors like the activities being carried out in specific areas, the presence of machinery, and the volume of human activity. The microphone 106 gathers real-time auditory data from different parts of the construction site. This data is then processed by the microcontroller, which is responsible for analyzing the captured sound levels. The device uses sound data to assess the noise characteristics of different sections of the building. For example, some parts of the construction area are noisy due to heavy equipment usage, while others are quieter, perhaps because they are intended for more controlled or quiet activities. The microcontroller distinguishes these differences by measuring the noise levels across the construction site, using the microphone 106 to capture these varying sound intensities.
[0039] Once the sound levels are recorded, the microcontroller interprets this data to provide meaningful suggestions to the user about how to address these differences in noise. Based on the level and characteristics of the recorded sound, the device recommends interior spaces or partitions for different sections of the building. For example, in areas where high noise levels are detected, such as near construction equipment, generators, or areas with intense activity, the microcontroller suggests adding soundproof or acoustically insulated walls. These interior spaces help mitigate noise pollution, making the area more comfortable and safer for workers or occupants in the future. On the other hand, in quieter areas where noise is minimal, the device suggests less sound-dampened interior spaces, focusing instead on functionality or aesthetic aspects.
[0040] For example, in areas with low noise levels, such as those located away from high-traffic or construction zones, the microcontroller recommends the layering of a bedroom as an interior space. Bedrooms typically require a peaceful and quiet environment for rest, and in quieter areas, the device prioritizes comfort, aesthetics, and privacy over soundproofing. On the other hand, in high-noise areas, such as spaces near kitchens, washrooms, or machinery-heavy zones, the microcontroller suggests the layering of these areas with sound-dampening materials. Kitchens and washrooms are naturally more robust spaces that handle some noise, but in high-noise environments, adding additional soundproofing helps minimize disruptions and ensure comfort and privacy for those using these spaces.
[0041] By correlating noise levels with specific needs for interior space types, the device optimizes the acoustic environment of the building. This helps ensure that the construction space is not only efficient in terms of spatial design but also customized for user comfort and operational efficiency. For example, areas that are meant for offices or residential spaces are developed with soundproofing or noise-reducing materials to create a peaceful environment, whereas industrial or machinery-heavy areas do not require such extensive soundproofing.
[0042] The noise-based interior space suggestions are also valuable for future building functionality. For example, when planning for different types of rooms or areas in a building, such as offices, conference rooms, kitchens, or bathrooms, understanding the noise environment is important for determining the most appropriate materials and structures for each space. For areas prone to high noise, the device suggests interior spaces that incorporate noise-dampening walls, ceilings, or flooring to ensure that the space meets acoustical standards, improving overall occupant comfort and productivity. In quieter spaces, the device suggests using less dense materials to save on cost or energy, as soundproofing is not a priority. As the building progresses and more areas are developed, noise levels change depending on construction activities, and the device continuously monitors the noise environment, providing updated recommendations to the construction team or architects in real time that help adapt the design and construction process based on the evolving acoustic landscape of the building site.
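The mapping from measured noise level to a suggested interior space could, for instance, follow a simple threshold rule. The sketch below is purely illustrative; the decibel thresholds and room labels are assumed values, not figures from the specification.

```python
def suggest_interior_space(noise_db):
    """Map a measured ambient noise level (in dB) to a suggested interior space.
    Thresholds are illustrative assumptions only."""
    if noise_db < 40:
        return "bedroom"        # quiet zones suit rest areas
    if noise_db < 60:
        return "drawing room"   # moderate zones suit living spaces
    # noisy zones: robust spaces plus acoustic treatment
    return "kitchen/washroom with sound-dampening walls"
```

In practice the microcontroller would average several readings per section before issuing a suggestion, so that transient construction noise does not skew the recommendation.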
[0043] A chamber 107 houses multiple plastic marking units, which are integral to the construction and design process of the building. The chamber 107 is installed on the cuboidal body 101 and is developed to store and distribute these marking units at specific locations as required by the user. The marking units are typically plastic markers that are used to indicate precise positions or boundaries within the building site. These markers serve as visual indicators for the construction team, allowing them to understand exactly where various elements of the building, such as walls and interior spaces, are to be placed or constructed.
[0044] The marking process is carried out with the help of the microcontroller which directs the movement of the device. The microcontroller is wirelessly connected to the computing unit, which receives the user’s input commands. Based on the user’s specifications for the building design and the location of the markings within the construction site, the microcontroller processes these commands and activates the motorized wheels 102 of the device. The motorized wheels 102 allow the device to move autonomously across the construction site, following a route to reach the user-defined sections of the building where markings are needed.
[0045] Once the device reaches the designated area, the microcontroller activates a telescopically operated gripper 108 attached to the body 101. The gripper 108 is used to pick up and place the plastic marking units at precise locations. The gripper 108 is capable of extending and retracting, enabling the gripper 108 to reach areas that are farther away or difficult to access. The ability to adjust its length makes the gripper 108 highly versatile, allowing it to work in different parts of the building and place markings at various heights and positions.
[0046] The telescopically operated gripper 108 is powered by a pneumatic unit that provides the necessary force for extension and retraction. The pneumatic unit comprises an air compressor, an air cylinder, air valves, and a piston. The air compressor supplies pressurized air, which is then stored in the air cylinder. The air valves control the flow of air into the cylinder, which regulates the movement of the piston inside it. The piston, when activated, moves the gripper 108 in and out, allowing it to extend or retract based on the specific needs of the task at hand. The pneumatic unit operates in collaboration with the microcontroller, which controls the sequence of actions for the gripper 108.
[0047] When the gripper 108 needs to extend to reach a target location, the microcontroller activates the appropriate air valves, allowing the compressed air to flow into the air cylinder, causing the piston to move and extend the gripper 108. Conversely, when the gripper 108 needs to retract, the microcontroller switches the air valves to release the air pressure, allowing the piston to return to its original position so that the gripper 108 shortens. This provides precise and smooth movement, which is crucial for placing the plastic marking units accurately at the user-defined locations.
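The extend/retract valve sequencing described above can be modeled as a simple state update bounded by the cylinder's stroke. The following toy Python model is a non-limiting sketch; the class name and the 300 mm stroke are illustrative assumptions.

```python
class PneumaticGripper:
    """Toy model of the valve sequence: opening the inlet valve pressurises
    the cylinder and moves the piston out; venting lets the piston return."""

    def __init__(self, stroke_mm=300):
        self.stroke_mm = stroke_mm  # maximum piston travel
        self.extension_mm = 0       # current extension, starts retracted

    def extend(self, mm):
        # microcontroller opens the inlet valve; travel is capped at full stroke
        self.extension_mm = min(self.stroke_mm, self.extension_mm + mm)
        return self.extension_mm

    def retract(self, mm):
        # microcontroller switches the valves to vent; piston cannot go below zero
        self.extension_mm = max(0, self.extension_mm - mm)
        return self.extension_mm
```

Clamping the extension between zero and the stroke length mirrors the physical limits of the air cylinder, so repeated commands cannot drive the model into an impossible state.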
[0048] Once the gripper 108 reaches the designated position, the gripper 108 picks up a plastic marking unit from the chamber 107 and places it at the specified location on the ground. The microcontroller ensures that the marking is placed with high accuracy, helping the construction team visualize the exact boundaries and layout as per the design specifications. The process is repeated as the device moves to other sections of the building, continuously placing the markers at different locations to facilitate the construction of walls and other interior spaces.
[0049] An ultrasonic sensor 109 is incorporated with the body 101 to ensure the accuracy and quality of construction, particularly in verifying the thickness of the walls being constructed within the user-defined areas. The ultrasonic sensor 109 enables the device to detect the thickness of walls as the device moves through the building site. The ultrasonic sensor 109 works in conjunction with the imaging unit 103 to capture the 3D map of the area and monitor various construction parameters in real time, ensuring that the building process adheres to the specified design and quality standards.
[0050] The ultrasonic sensor 109 operates on the principle of sound wave reflection, a non-destructive method of measuring thickness. The ultrasonic sensor 109 emits high-frequency sound waves that travel through the wall material. When the sound waves reach the far boundary of the wall, they are reflected back to the sensor 109. The time taken for the sound waves to travel through the wall and back is measured, and from this time delay the sensor 109 calculates the thickness of the wall. The accuracy of this measurement depends on the sensor's ability to send and receive sound waves efficiently, even when the wall material is dense or irregular. The embedded microcontroller processes the data obtained from the ultrasonic sensor 109 and compares it to predefined parameters. These parameters are set by the user based on the specific requirements of the building project. For example, certain areas may require walls to meet a minimum thickness threshold to support interior spaces, insulation, or other structural components. If the ultrasonic sensor 109 detects that the thickness of the wall is below the specified threshold, this information is transmitted to the microcontroller, which then triggers an alert. This alert is communicated to the user via the computing unit, which is wirelessly connected to the device.
[0051] The user interface on the computing unit displays this alert, notifying the construction team or the project manager that a certain section of the wall does not meet the required thickness. This notification serves as a real-time diagnostic tool, allowing the construction team to take immediate corrective action before proceeding with the construction. The alert ensures that the building project is not compromised by errors in wall construction that could lead to structural weaknesses, insulation issues, or other long-term problems. By continuously monitoring the wall thickness as the body 101 moves through the building site, the device helps maintain quality control throughout the construction process, allowing for more efficient oversight and timely adjustments.
[0052] In addition to wall thickness, the ultrasonic sensor 109, together with the imaging unit 103, enables the device to correlate spatial data with physical attributes. As the imaging unit 103 captures images and generates the 3D map of the building's interior, the ultrasonic sensor 109 scans multiple sections of the walls in real time. This combined data stream provides a comprehensive view of both the visual layout and the structural integrity of the walls, enabling the user to see how the wall's thickness aligns with the design specifications while simultaneously viewing the construction site from a spatial perspective. The imaging unit's ability to capture multiple angles and provide a complete 3D representation enhances the overall accuracy of the measurements provided by the ultrasonic sensor 109. If the detected thickness is insufficient, the microcontroller provides recommendations for corrective actions; for example, it may suggest adding more material or adjusting construction techniques to achieve the required wall thickness.
[0053] Lastly, a battery (not shown in figure) is associated with the device to supply power to the electrically powered components employed herein. The battery comprises a pair of electrodes, namely a cathode and an anode. The battery uses an oxidation-reduction chemical reaction to do work on charge and produce a voltage between the anode and cathode, thereby generating the electrical energy used to operate the device.
[0054] The present invention works best in the following manner, where the cuboidal body 101 is positioned on the ground surface of an under-construction building as disclosed in the proposed invention. Initially, the user interacts with the user interface on the computing unit, which is wirelessly connected to the device. The user inputs commands to generate the three-dimensional map of the building and specifies the location to be designed. The microcontroller, wirelessly linked to the computing unit via communication modules such as Wi-Fi, Bluetooth, or GSM, processes these commands and activates the GPS module to determine the device's real-time location, calculating the optimal route to the user-defined location. The microcontroller then actuates the motorized tracked wheels 102 to autonomously move the device along the determined route. Once at the target location, the AI-based imaging unit 103, mounted on the robotic arm 104, captures multiple images of the area to create the 3D map, which is displayed on the computing unit for the user to review. The holographic projection unit 105 enables the user to visualize the 3D map as a hologram. The device also records environmental noise levels using the microphone 106, and based on the noise levels, the microcontroller suggests suitable interior spaces for various sections. The microcontroller then directs the telescopically operated gripper 108 to place plastic markers in predefined sections, guiding the construction of interior spaces. The ultrasonic sensor 109 detects the wall thickness and alerts the user if the thickness falls below the required standards.
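The operating cycle described in the preceding paragraph can be summarized as an ordered sequence of stages. The stage names and handler interface below are purely illustrative assumptions, not terminology from the specification.

```python
# Assumed stage names summarizing the device's operating cycle.
WORKFLOW = [
    "receive_user_commands",    # user interface on the computing unit
    "locate_via_gps",           # microcontroller reads real-time position
    "navigate_tracked_wheels",  # drive body 101 to the user-defined location
    "capture_images_3d_map",    # AI-based imaging unit 103 on robotic arm 104
    "project_hologram",         # holographic projection unit 105
    "record_noise_levels",      # microphone 106; interior-space suggestions
    "place_markers",            # telescopic gripper 108 and plastic markers
    "verify_wall_thickness",    # ultrasonic sensor 109 against the threshold
]

def run_cycle(handlers: dict) -> list:
    """Execute each stage handler in order, collecting its result.

    `handlers` maps each stage name to a zero-argument callable; the
    strict ordering reflects that navigation precedes imaging, and
    marker placement precedes thickness verification.
    """
    return [handlers[stage]() for stage in WORKFLOW]
```

A caller could supply stub handlers (e.g. logging functions) for each stage to exercise the sequence end to end.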
[0055] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.
Claims:
1) A designing assistive device for buildings, comprising:
i) a cuboidal body 101 positioned on a ground surface in an under-construction building, wherein a user-interface inbuilt in a computing unit is wirelessly associated with said device for enabling a user to give input commands for preparing a three-dimensional map of said building along with specifying a location of said building to be designed;
ii) a microcontroller wirelessly linked with said computing unit that processes said input commands and activates a GPS (Global Positioning System) module installed on said body 101 for detecting a real-time location of said body 101, in accordance with which said microcontroller determines a route to be followed by said body 101 for reaching said user-defined location, wherein said microcontroller actuates motorized tracked wheels 102 arranged underneath said body 101 for maneuvering said body 101 along said determined route to reach said user-defined location;
iii) an artificial intelligence-based imaging unit 103 paired with a processor installed on said body 101 via a robotic arm 104 for capturing multiple images of said user-defined area of said building for preparing a 3-D map, wherein said 3-D map is displayed on said computing unit for allowing said user to mark sections in said area where different interior spaces are to be built and different interiors are to be arranged, wherein said robotic arm 104 moves said imaging unit 103 in multiple directions to allow capturing of images of the entire area;
iv) a microphone 106 positioned on said body 101 that is activated by said microcontroller for recording noise in the surroundings in different sections of said area, wherein based on the noise level, said microcontroller suggests different types of interior spaces along with appropriate construction suitable for different sections of said area;
v) a chamber 107 stored with multiple plastic marking units, installed with said body 101, wherein said microcontroller directs said wheels 102 for maneuvering said body 101 to said user-defined sections, wherein said microcontroller actuates a telescopically operated gripper 108 arranged with said body 101 for placing said markings at said user-defined sections to allow a concerned person to check said markings and construct said interior spaces accordingly; and
vi) an ultrasonic sensor 109 arranged with said body 101 that works in synchronization with said imaging unit 103 for detecting a thickness of walls constructed in said area, wherein in case said detected thickness falls below a threshold value, as per said interior spaces to be constructed, said microcontroller sends an alert on said computing unit regarding inappropriate thickness of said walls.
2) The device as claimed in claim 1, wherein a holographic projection unit 105 is installed on said body 101 that is activated by said microcontroller for projecting a hologram of said prepared 3-D map to allow said user to get an idea of how said area would look upon construction.
3) The device as claimed in claim 1, wherein said microcontroller is wirelessly linked with said computing unit via a communication module which includes, but is not limited to, a Wi-Fi (Wireless Fidelity) module, a Bluetooth module, and a GSM (Global System for Mobile Communication) module.
4) The device as claimed in claim 1, wherein said telescopically operated gripper 108 is powered by a pneumatic unit that includes an air compressor, an air cylinder, air valves, and a piston, which work in collaboration to aid in the extension and retraction of said gripper 108.
5) The device as claimed in claim 1, wherein a battery is associated with said device for supplying power to electrically and electronically operated components associated with said device.
6) The device as claimed in claim 1, wherein said interior spaces refer to a kitchen, bedroom, dining hall, drawing room, and washroom.
| # | Name | Date |
|---|---|---|
| 1 | 202421094496-STATEMENT OF UNDERTAKING (FORM 3) [01-12-2024(online)].pdf | 2024-12-01 |
| 2 | 202421094496-REQUEST FOR EXAMINATION (FORM-18) [01-12-2024(online)].pdf | 2024-12-01 |
| 3 | 202421094496-REQUEST FOR EARLY PUBLICATION(FORM-9) [01-12-2024(online)].pdf | 2024-12-01 |
| 4 | 202421094496-POWER OF AUTHORITY [01-12-2024(online)].pdf | 2024-12-01 |
| 5 | 202421094496-FORM-9 [01-12-2024(online)].pdf | 2024-12-01 |
| 6 | 202421094496-FORM FOR SMALL ENTITY(FORM-28) [01-12-2024(online)].pdf | 2024-12-01 |
| 7 | 202421094496-FORM 18 [01-12-2024(online)].pdf | 2024-12-01 |
| 8 | 202421094496-FORM 1 [01-12-2024(online)].pdf | 2024-12-01 |
| 9 | 202421094496-FIGURE OF ABSTRACT [01-12-2024(online)].pdf | 2024-12-01 |
| 10 | 202421094496-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [01-12-2024(online)].pdf | 2024-12-01 |
| 11 | 202421094496-EVIDENCE FOR REGISTRATION UNDER SSI [01-12-2024(online)].pdf | 2024-12-01 |
| 12 | 202421094496-EDUCATIONAL INSTITUTION(S) [01-12-2024(online)].pdf | 2024-12-01 |
| 13 | 202421094496-DRAWINGS [01-12-2024(online)].pdf | 2024-12-01 |
| 14 | 202421094496-DECLARATION OF INVENTORSHIP (FORM 5) [01-12-2024(online)].pdf | 2024-12-01 |
| 15 | 202421094496-COMPLETE SPECIFICATION [01-12-2024(online)].pdf | 2024-12-01 |
| 16 | Abstract.jpg | 2024-12-27 |
| 17 | 202421094496-FORM-26 [03-06-2025(online)].pdf | 2025-06-03 |