
Archaeological Excavation System For Artifact Scanning And Restoration Guidance

Abstract: An archaeological excavation system for artifact scanning and restoration guidance, comprising a mobile body 101 with omnidirectional wheels 102 for site navigation; an ultrasonic sensor that detects human presence; an artificial intelligence-based imaging unit 103 for facial recognition; a ground-penetrating radar (GPR) that scans for artifacts and displays their depth and location on a touchscreen display 104; a 3D holographic projection unit 105 that provides excavation guides; haptic feedback gloves 106 that offer real-time force guidance; an artifact scanning module with LiDAR and an AI classifier that performs 360-degree artifact analysis; a robotic arm 109 and an ultrasonic tank 111 that gently clean artifacts; an X-ray fluorescence (XRF) sensor that analyzes material composition, while machine learning estimates artifact age and presents restoration suggestions on the display 104; a GPS module for geo-tagging; and a storage chamber 114 for storing tools.


Patent Information

Application #
Filing Date
30 May 2025
Publication Number
25/2025
Publication Type
INA
Invention Field
PHYSICS
Status
Parent Application

Applicants

Marwadi University
Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.

Inventors

1. Dr. Madhu Shukla
Head of the Department, Department of Computer Science and Engineering - Artificial Intelligence, Machine Learning, Data Science, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
2. Simrin Fathima Syed
Department of Computer Science and Engineering - Artificial Intelligence, Machine Learning, Data Science, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
3. Vipul Ladva
Department of Computer Science and Engineering - Artificial Intelligence, Machine Learning, Data Science, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
4. Dr. Nishant Kothari
Head of the Department, Department of Electrical Engineering, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
5. Akshay Ranpariya
Department of Computer Science and Engineering - Artificial Intelligence, Machine Learning, Data Science, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.
6. Neel Dholakia
Department of Computer Science and Engineering - Artificial Intelligence, Machine Learning, Data Science, Marwadi University, Rajkot - Morbi Road, Rajkot 360003 Gujarat, India.

Specification

Description: FIELD OF THE INVENTION

[0001] The present invention relates to an archaeological excavation system for artifact scanning and restoration guidance that is capable of enhancing efficiency, precision, and safety in archaeological site operations through technological integration.

BACKGROUND OF THE INVENTION

[0002] Archaeological excavation is a meticulous and often challenging process, critical for understanding human history and culture. The accurate identification, careful recovery, and proper preservation of artifacts are paramount to successful archaeological endeavors. Existing methods frequently rely on manual labor, traditional tools, and subjective interpretation, which can be time-consuming, prone to human error, and potentially damaging to delicate artifacts or the surrounding stratigraphy. Furthermore, the analysis and restoration phases, typically performed off-site and often requiring specialized expertise, can be lengthy and resource-intensive, leading to delays in documentation and public access to discovered heritage. There is a continuous need for innovative solutions that can integrate advanced technologies to improve the efficiency, precision, and safety of archaeological fieldwork, from initial site assessment to the post-excavation analysis and conservation guidance.

[0003] Traditionally, archaeological fieldwork involves manual probing, digging, and sifting, with artifact detection often relying on visual cues or basic metal detectors. Excavation depth and artifact location are typically recorded manually or with basic surveying tools, which can lack high-resolution accuracy. Cleaning and initial identification of artifacts are often performed by hand, carrying risks of damage. Comprehensive material analysis and precise dating typically require laboratory equipment and specialized personnel, detaching these processes from the immediate excavation environment. The lack of real-time, integrated guidance for excavation techniques, force application, and on-site restoration suggestions presents significant challenges for both experienced archaeologists and trainees, often leading to inconsistencies in practice and potential missed opportunities for immediate preservation measures.

[0004] US9290905B1 discloses a remote excavator tool that fastens to a robotic arm on a remotely controlled robotic platform that includes a track drive. The tool uses high-speed tilling elements rotating at about 1500 rpm to dig a trench efficiently using a small amount of power. The tilling elements are hardened steel, rotating counterclockwise relative to a conventional tiller. The tilling elements are symmetrically mounted on a polygonal shaft, and include right and left multiple pairs of facing disks with staggered curved tines, where the tines are thick and have tapered hardened edges. Round brushes are interspaced between the pairs. The loosened soil is pushed forward and to the sides to help protect the robotic platform and maintain control of the tool, especially as the rate of excavation partially depends on the characteristics of the material being excavated.

[0005] US10060097B2 discloses an excavation system for use with an excavation machine having a work tool and with an IPCC. The excavation system may have a location device configured to generate a first signal indicative of a location of the excavation machine, a display, and at least one controller in communication with the location device and the display. The controller may be configured to receive a second signal indicative of a location of the IPCC, and to cause representations of the excavation machine and the IPCC to be shown simultaneously on the display based on the first and second signals. The at least one controller may also be configured to determine a swing radius of the work tool, and to selectively cause an indication of alignment between the IPCC and the swing radius to be shown on the display based on the first signal, the second signal, and the swing radius.

[0006] Conventionally, many systems are disclosed and utilized in archaeological investigation, but these existing systems often present limitations in terms of automation, integration, and real-time guidance. While some specialized tools exist for subsurface scanning or laboratory analysis, they often lack full integration into a unified, mobile system capable of orchestrating the entire excavation-to-restoration workflow.

[0007] In order to overcome the aforementioned drawbacks, there exists a need in the art to develop a fully integrated, mobile, and intelligent solution for archaeological excavation, artifact scanning, and restoration guidance. The developed system needs to address this by providing a comprehensive system capable of enhancing efficiency, precision, and safety throughout the entire archaeological process, from initial site assessment to post-excavation analysis and conservation.

OBJECTS OF THE INVENTION

[0008] The principal object of the present invention is to overcome the disadvantages of the prior art.

[0009] An object of the present invention is to provide a system that significantly improves the precision and efficiency of artifact detection and excavation through the use of advanced sensing technologies and real-time data display.

[0010] Another object of the present invention is to develop a system that offers comprehensive, real-time guidance and training for individuals involved in archaeological excavation, ensuring proper techniques, optimal tool selection, and minimizing risk of artifact damage through haptic feedback and holographic projections.

[0011] Yet another object of the present invention is to enable on-site, automated cleaning, analysis, and preliminary restoration guidance for recovered artifacts, leveraging robotics, specialized cleaning processes, and AI-driven material identification and age estimation.

[0012] The foregoing and other objects, features, and advantages of the present invention will become readily apparent upon further review of the following detailed description of the preferred embodiment as illustrated in the accompanying drawings.

SUMMARY OF THE INVENTION

[0013] The present invention relates to an archaeological excavation system for artifact scanning and restoration guidance capable of enhancing the efficiency, precision, and safety of archaeological digs, providing real-time data, and offering intelligent support for artifact handling and preliminary restoration.

[0014] According to an embodiment of the present invention, an archaeological excavation system for artifact scanning and restoration guidance comprises a mobile body structured to be positioned on a surface of an archaeological site, wherein multiple omnidirectional wheels are arranged underneath the body to provide flexible movement in all directions, an ultrasonic sensor installed on the body to detect the presence of individuals in proximity to the body, wherein upon successful detection, a microcontroller linked with the ultrasonic sensor activates an artificial intelligence-based imaging unit installed on the body and integrated with a facial recognition protocol, a ground-penetrating radar (GPR) mounted on the body to scan the surface of the site for potential artifacts, wherein the GPR generates a heat map displayed on a touchscreen display installed on the body, providing depth and location data of detected artifacts to the individual(s), a 3D holographic projection unit positioned on the body to provide detailed visual guides for excavation techniques, including proper tool selection and artifact dating based on soil depth, and a pair of gloves associated with the system and adapted to be worn by the individual(s), the gloves being equipped with pressure sensors and haptic feedback units, providing real-time feedback on the excavation force applied and guiding the trainee in correct excavation methods.

[0015] According to another embodiment of the present invention, an artifact scanning and identification module for artifact scanning and restoration guidance comprises a rotating platform and a motorized dual-axis slider, integrated with a LiDAR (Light Detection and Ranging) sensor and synced with the imaging unit, for 360-degree scanning and classification of recovered artifacts based on material composition and historical data, a robotic arm with a soft bristle brush arranged within the platform, wherein the microcontroller actuates the arm to gently clean the artifact to remove loose dirt and debris, an ultrasonic cleaning tank arranged adjacent to the platform, the tank comprising a robotic gripper with a rubberized end for transferring the artifact from the platform into the tank, and a Peltier unit positioned within the tank for maintaining a predefined temperature suitable for the cleaning process, and an X-ray fluorescence (XRF) sensor mounted on the body for analyzing material composition of the cleaned artifact, wherein the microcontroller further employs machine learning protocols and historical GPS-linked excavation data to estimate artifact age and classify the artifact, and, based on the classification and artifact condition, the microcontroller provides restoration suggestions that are further displayed on the touchscreen display for guidance, a GPS (Global Positioning System) module integrated with the microcontroller for real-time geo-location tracking, geo-tagging, and mapping of the excavation site, and a storage chamber integrated within the body to house excavation tools, wherein the microcontroller instructs the individual(s) to select the appropriate tool based on the artifact detection and excavation requirements.

[0016] While the invention has been described and shown with particular reference to the preferred embodiment, it will be apparent that variations might be possible that would fall within the scope of the present invention.

BRIEF DESCRIPTION OF THE DRAWINGS

[0017] These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings where:
Figure 1 illustrates an isometric view of an archaeological excavation system for artifact scanning and restoration guidance.

DETAILED DESCRIPTION OF THE INVENTION

[0018] The following description includes the preferred best mode of one embodiment of the present invention. It will be clear from this description of the invention that the invention is not limited to these illustrated embodiments but that the invention also includes a variety of modifications and embodiments thereto. Therefore, the present description should be seen as illustrative and not limiting. While the invention is susceptible to various modifications and alternative constructions, it should be understood that there is no intention to limit the invention to the specific form disclosed, but, on the contrary, the invention is to cover all modifications, alternative constructions, and equivalents falling within the spirit and scope of the invention as defined in the claims.

[0019] In any embodiment described herein, the open-ended terms "comprising," "comprises," and the like (which are synonymous with "including," "having," and "characterized by") may be replaced by the respective partially closed phrases "consisting essentially of," "consists essentially of," and the like, or the respective closed phrases "consisting of," "consists of," and the like.

[0020] As used herein, the singular forms “a,” “an,” and “the” designate both the singular and the plural, unless expressly stated to designate the singular only.

[0021] The present invention relates to an archaeological excavation system for artifact scanning and restoration guidance that is capable of automating critical aspects of artifact detection, precise excavation guidance, comprehensive on-site analysis, and preliminary preservation. Additionally, the system pertains to a mobile, automated, and intelligent system designed to assist in the detection, excavation, cleaning, identification, classification, and restoration suggestion for archaeological artifacts.

[0022] Referring to Figure 1, an isometric view of an archaeological excavation system for artifact scanning and restoration guidance is illustrated, comprising a mobile body 101 installed with multiple omnidirectional wheels 102, an artificial intelligence-based imaging unit 103 installed on the body 101 and integrated with a facial recognition protocol, a touchscreen display 104 installed on the body 101, a 3D (three-dimensional) holographic projection unit 105 positioned on the body 101, and a pair of gloves 106 associated with the system.

[0023] Figure 1 further illustrates a rotating platform 107 and a motorized dual-axis slider 108 comprised in an artifact scanning and identification module on the body 101, a robotic arm 109 with a soft bristle brush 110 mounted on the platform 107, an ultrasonic tank 111 arranged adjacent to the platform 107, a robotic gripper 112 with a rubberized end comprised on the tank 111, a storage container 113 mounted on the body 101, and a storage chamber 114 integrated on the body 101.

[0024] The system disclosed herein includes a mobile body 101 that is designed to be positioned on a surface of an archaeological site and serves as a central enclosure for all the operations of the system. The mobile body 101 is structured to navigate the uneven terrain of the archaeological site. Multiple omnidirectional wheels 102 are arranged underneath the body 101 to provide flexible movement in all directions. The omnidirectional wheel 102 functions through a specialized design that allows for movement along two axes simultaneously. The omnidirectional wheel 102 comprises a main hub and a series of smaller, unpowered rollers positioned around its circumference. These rollers are aligned perpendicularly to the main wheel's rotational axis. When the main wheel 102 is driven forward or backward by its dedicated motor, the rollers passively spin, enabling the wheel to also slide laterally without skidding. This unique internal arrangement of a primary driven wheel 102 and freely rotating perpendicular rollers grants the omnidirectional wheel 102 its distinctive multi-directional movement capabilities, allowing the mobile body 101 to translate and rotate independently and simultaneously, thus providing highly flexible and precise maneuverability on the archaeological site.
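By way of illustration only, the coordinated drive of four such wheels can be sketched with standard mecanum-style inverse kinematics; the chassis half-length and half-width below are assumed values, not dimensions from this specification.

```python
def mecanum_wheel_speeds(vx, vy, omega, half_length=0.2, half_width=0.15):
    """Inverse kinematics for a four-wheel omnidirectional base.

    vx, vy -- desired body velocity (m/s), forward and lateral
    omega  -- desired rotation rate (rad/s)
    Returns rim speeds (front-left, front-right, rear-left, rear-right).
    Geometry constants are illustrative assumptions.
    """
    k = half_length + half_width  # lever arm coupling rotation into wheels
    return (
        vx - vy - k * omega,  # front-left
        vx + vy + k * omega,  # front-right
        vx + vy - k * omega,  # rear-left
        vx - vy + k * omega,  # rear-right
    )
```

Driving all wheels at equal speed moves the body straight ahead, while opposite-signed lateral terms produce the sideways slide described above.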

[0025] A user activates the system for further operation by simply pressing a push button installed on the body 101. The push button is typically made from polycarbonate. When the push button is pressed by the user to switch on the system, it allows current to flow. This sends a signal to the microcontroller, instructing the microcontroller to activate the system. The microcontroller then powers up the system, enabling it to function.

[0026] Once the system powers up, the microcontroller activates an ultrasonic sensor installed on the body 101 to detect the presence of an individual in proximity to the body. The ultrasonic sensor operates by utilizing a piezoelectric transducer. The transducer, connected to a control circuit, generates high-frequency sound waves (ultrasound) when an electrical pulse is applied. These waves travel outward. If an object is detected, the sound waves reflect back as an echo, which the same transducer then converts back into an electrical signal. The control circuit accurately measures the "time of flight" - the duration between emitting the pulse and receiving the echo. Knowing the speed of sound, the sensor's internal processor calculates the precise distance to the object. The distance data is then relayed to the system's microcontroller for triggering further actions.
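By way of illustration, the time-of-flight computation the sensor's processor performs can be sketched as follows; the temperature-dependent speed-of-sound approximation is a standard formula, and the default ambient temperature is an assumption.

```python
def ultrasonic_distance_m(time_of_flight_s, temperature_c=20.0):
    """Distance from an ultrasonic echo: the pulse travels out and
    back, so the round-trip time is halved. Speed of sound in air
    rises with temperature, approx. 331.3 + 0.606*T m/s."""
    speed_of_sound = 331.3 + 0.606 * temperature_c
    return speed_of_sound * time_of_flight_s / 2.0
```

At 20 degrees C an echo returning after 10 ms corresponds to an object roughly 1.72 m away.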

[0027] Upon successful detection of an individual in proximity to the body 101, the microcontroller activates an artificial intelligence-based imaging unit 103 installed on the body 101 to confirm the presence of the individual. The imaging unit 103 integrates a high-resolution camera sensor with a dedicated Image Signal Processor (ISP) and a specialized Neural Processing Unit (NPU). When the imaging unit 103 is activated by the microcontroller after ultrasonic detection, the camera captures raw image data. The ISP then processes this data, performing tasks like de-noising, color correction, and sharpening to produce a clear digital image. This image feed is then fed into the NPU, which hosts a pre-trained Convolutional Neural Network (CNN). For the facial recognition protocol, the CNN is designed to detect and localize faces within the image frame. It then extracts unique facial features, such as distances between key points and contours, to create a faceprint. The faceprint is compared against a small, pre-registered database of authorized personnel. If a match is found, the NPU sends a signal back to the microcontroller, confirming the presence of the individual and allowing the system to transition to full operational mode.
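The faceprint comparison described above can be illustrated with a minimal cosine-similarity matcher; the similarity threshold, the toy embedding vectors, and the dictionary database are assumptions for this sketch, not details from the specification.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two faceprint vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_faceprint(probe, database, threshold=0.8):
    """Compare a probe faceprint against a small registered set;
    return the best-matching name only if it clears the threshold."""
    best_name, best_score = None, -1.0
    for name, template in database.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None
```

In practice the embeddings would come from the CNN; here short vectors stand in for them.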

[0028] Once a match is found, the microcontroller activates a ground-penetrating radar (GPR) mounted on the robotic body 101 to scan the surface of the site for potential artifacts. The GPR operates by transmitting and receiving high-frequency electromagnetic (EM) pulses into the ground. The radar comprises a transmitter antenna, a receiver antenna, and a control unit with processing electronics. The control unit generates precise electrical pulses, which the transmitter antenna converts into EM waves and sends into the subsurface. As these waves propagate, they encounter different materials, such as soil layers, rocks, and buried artifacts, with varying dielectric properties and electrical conductivity. When a change in these properties occurs, a portion of the EM wave is reflected back to the surface. The receiver antenna detects these reflected signals, which are then sent back to the control unit. The control unit precisely measures the two-way travel time of these echoes. By analyzing the time delays and amplitudes of the reflections, the internal processor constructs a subsurface profile, allowing it to generate a heat map displayed on a touchscreen display 104 installed on the body 101, indicating the depth and location data of detected artifacts to the individual.
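The depth estimate behind such a profile follows the standard two-way travel-time relation: the EM wave slows in soil by the square root of the relative permittivity. The default permittivity below (typical of moist soil) is an assumed value for illustration.

```python
def gpr_depth_m(two_way_time_ns, relative_permittivity=9.0):
    """Estimate reflector depth from a GPR two-way travel time.
    Wave speed in the ground: v = c / sqrt(er); the travel time is
    halved because the echo covers the path twice."""
    c = 0.2998  # speed of light in vacuum, meters per nanosecond
    v = c / relative_permittivity ** 0.5
    return v * two_way_time_ns / 2.0
```

With a relative permittivity of 9, an echo at 20 ns places the reflector at roughly 1 m depth.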

[0029] The heat map is a data visualization where values are represented by colors. The heat map uses a color gradient, typically from cool to warm tones, to show intensity or density across a spatial arrangement, such as a grid. This visual representation helps quickly identify patterns, trends, and areas of high or low concentration within complex datasets on the touchscreen display 104. The touchscreen display 104 operates using projected capacitive (PCAP) technology, chosen for its durability and multi-touch capabilities, crucial for field use. Internally, a transparent conductive grid of electrodes is embedded beneath a protective glass. When the user's finger or a conductive stylus touches the screen, it disrupts the electrical field of these electrodes, causing a measurable change in capacitance at that specific point. A dedicated touchscreen controller then detects these capacitance changes, triangulates the exact touch location, and translates it into digital signals. These signals are transmitted to the microcontroller, which interprets the touch input to display ground-penetrating radar heat maps, provide restoration suggestions, or guide tool selection, offering a real-time, interactive interface for archaeologists.

[0030] A 3D (three-dimensional) holographic projection unit 105 is mounted on the body 101 to provide detailed visual guides for excavation techniques, including proper tool selection and artifact dating based on soil depth. The projection unit 105 internally relies on principles of light diffraction and interference to create realistic, three-dimensional images in free space. The projection unit 105 employs a laser as a coherent light source, whose beam is split into two paths: an object beam and a reference beam. The object beam is modulated by a Spatial Light Modulator (SLM), which dynamically controls the phase and amplitude of the light based on the 3D data of the visual guide (excavation techniques, tool selection, artifact dating). The reference beam, unaltered, then interferes with the modulated object beam. This interference pattern, containing the depth and spatial information of the 3D guide, is precisely projected, often onto a specialized screen or a carefully designed optical setup. The human eye then perceives this reconstructed light field as a volumetric 3D image, appearing to float in front of the projection unit 105, providing immersive and detailed visual instructions to the archaeologist.

[0031] The system discussed herein is also associated with a pair of gloves 106 which are adapted to be worn by the individual(s) and are equipped with pressure sensors and haptic feedback units that provide real-time feedback on the excavation force applied, guiding the trainee in correct excavation methods. The pressure sensors are flexible resistive and capacitive sensors embedded at the fingertips and key palm areas within the glove's fabric. When the user applies force during excavation, these sensors deform, causing a change in their electrical properties; this change is measured by an internal circuit, quantified as applied pressure, and transmitted to the microcontroller.

[0032] Simultaneously, the haptic feedback units consist of miniature electromechanical actuators, most commonly Eccentric Rotating Mass (ERM) motors, embedded within the gloves 106. When the microcontroller determines that the user is applying too much force, or needs to adjust their excavation angle, the microcontroller sends a precisely modulated electrical signal to these actuators. An ERM motor works by rapidly spinning an unbalanced weight, causing the motor itself to vibrate, and this vibration is then transmitted to the user's hand. The frequency and amplitude of these vibrations are carefully controlled to provide distinct tactile sensations, guiding the trainee through proper excavation methods and preventing artifact damage.
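The force-to-vibration mapping described above can be sketched as a simple threshold-and-ramp rule; the safe force limit, the ramp width, and the 8-bit PWM range are illustrative assumptions, not parameters from the specification.

```python
def haptic_command(pressure_n, safe_limit_n=2.0, max_pwm=255):
    """Map a measured fingertip force to an ERM motor PWM duty.
    Below the safe limit the motor stays off; above it, vibration
    intensity ramps with the overshoot and saturates at full duty."""
    if pressure_n <= safe_limit_n:
        return 0
    overshoot = pressure_n - safe_limit_n
    # reach full intensity once force exceeds the limit by 2 N (assumed)
    return min(max_pwm, int(max_pwm * overshoot / 2.0))
```

Gentle work below the limit produces no vibration, while pressing harder yields a proportionally stronger warning buzz.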

[0033] An artifact scanning and identification module is mounted on the body 101 to precisely capture the physical characteristics of a recovered artifact and then automatically classify the artifact based on comprehensive data analysis. The module's internal workings combine mechanical precision with advanced sensing and artificial intelligence (AI). The module further includes an AI-based classification protocol that analyzes 3D images captured by the imaging unit 103 and classifies recovered artifacts according to material type, such as but not limited to metal, bone, or ceramic, providing restoration suggestions based on the classification.

[0034] Furthermore, the artifact scanning and identification module integrates a rotating platform 107 and a motorized dual-axis slider 108 for precise and automated positioning of the artifact. The rotating platform 107 operates internally via a precisely controlled stepper motor. This motor receives commands from the system's microcontroller, allowing it to rotate the platform 107 in highly accurate, incremental steps. A gear reduction arrangement, typically involving gears, connects the motor to the platform 107. The gearing increases torque and reduces speed, ensuring smooth, stable, and controlled rotation, even for delicate artifacts. The platform 107 itself is mounted on low-friction bearings to facilitate effortless movement and maintain stability.

[0035] The motorized dual-axis slider 108 functions internally to provide precise linear movement along two independent axes: X for horizontal and Y for vertical movement. Each axis typically employs a stepper motor coupled with a lead screw. When the microcontroller sends electrical pulses to a stepper motor, it rotates in precise increments. This rotational motion is converted into linear motion by the screw, which engages with a nut attached to the sliding carriage. As the screw turns, the nut and carriage move along the axis. Linear guide rails ensure smooth, straight travel and prevent deflection, which allows the artifact, and the scanning sensors, to be moved with high accuracy and repeatability, ensuring comprehensive and precise scanning from all necessary perspectives.
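The pulse count needed for a given carriage travel follows directly from the screw lead and the motor's step resolution; the lead, steps-per-revolution, and microstepping values below are typical assumptions, not specified parameters.

```python
def steps_for_travel(distance_mm, screw_lead_mm=2.0, steps_per_rev=200,
                     microstepping=16):
    """Convert a target linear travel into stepper pulses: one full
    screw revolution advances the carriage by exactly one lead."""
    revolutions = distance_mm / screw_lead_mm
    return round(revolutions * steps_per_rev * microstepping)
```

With a 2 mm lead and 16x microstepping on a 200-step motor, a 10 mm move takes 16,000 pulses, which is what gives the slider its fine positioning resolution.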

[0036] A LiDAR (Light Detection and Ranging) sensor is integrated into the artifact scanning and identification module and synced with the imaging unit 103, for 360-degree scanning and classification of recovered artifacts based on material composition and historical data. The LiDAR sensor internally operates by emitting rapid pulses of laser light. These pulses travel outwards and strike the target artifact. A highly sensitive optical receiver in the LiDAR sensor then detects the reflected laser light. The crucial step is the precise measurement of the time-of-flight, the minuscule time it takes for each laser pulse to travel from the sensor, hit the object, and return. Since the speed of light is constant and known, the sensor's internal processor uses this time-of-flight data to accurately calculate the distance to each point on the artifact's surface. By rapidly firing millions of pulses and measuring their individual return times, the LiDAR sensor builds a dense point cloud, a collection of precise 3D coordinates (X, Y, Z) that collectively form a highly accurate digital representation of the artifact's shape and surface.
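Each point in such a point cloud follows from one time-of-flight measurement plus the beam's pointing angles. A minimal sketch of that conversion, assuming the sensor reports azimuth and elevation for each pulse:

```python
import math

def lidar_point(time_of_flight_s, azimuth_rad, elevation_rad):
    """Convert one pulse's time of flight and beam angles into an
    (x, y, z) point: range = c * t / 2, then spherical to Cartesian."""
    c = 299_792_458.0  # speed of light in vacuum, m/s
    r = c * time_of_flight_s / 2.0
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return (x, y, z)
```

Repeating this for millions of pulses at different angles yields the dense 3D surface model described above.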

[0037] After scanning the artifact, the microcontroller actuates a robotic arm 109 with a soft bristle brush 110 installed on the platform 107 to gently clean the artifact, removing loose dirt and debris. Each joint of the robotic arm 109 is actuated by a dedicated servo motor. These motors are precisely controlled by electrical signals from the microcontroller, which dictate their exact angle and speed. To ensure accuracy, each joint is equipped with optical encoders, which provide real-time feedback on the joint's current position to the microcontroller, forming a closed-loop control arrangement. This feedback allows the microcontroller to constantly monitor and adjust the arm's trajectory and position with high precision. The soft bristle brush 110 at the end of the arm 109 is driven by its own small motor, allowing for independent rotation, while integrated pressure sensors at the brush's contact point provide additional feedback to the microcontroller, enabling the bristles to apply just the right amount of gentle pressure for cleaning delicate artifacts without causing damage.

[0038] A robotic gripper 112 with a rubberized end is provided on an ultrasonic cleaning tank 111 arranged adjacent to the platform 107 for transferring the artifact from the platform 107 into the tank 111. The robotic gripper 112 typically utilizes a servo motor connected to a parallel jaw linkage. When the microcontroller sends a signal, the motor rotates, and the rotational motion is converted into linear motion by the linkage, causing the gripper's jaws to open and close. The rubberized end ensures a high coefficient of friction and a gentle, non-damaging grip on the artifact. Force sensors are also integrated to provide feedback to the microcontroller on the gripping force applied and the jaws' position, allowing for adaptive and precise control to prevent crushing fragile artifacts while ensuring a secure hold for transfer into the ultrasonic cleaning tank 111.

[0039] The ultrasonic cleaning tank 111 works internally by harnessing the phenomenon of cavitation to gently yet effectively remove dirt and debris from artifacts. At its core, an ultrasonic generator converts standard electrical power into a high-frequency electrical signal, typically in the kilohertz range. This high-frequency signal is then sent to transducers, which are typically piezoelectric units mounted to the stainless steel tank 111. The transducers convert the electrical energy into high-frequency mechanical vibrations. These vibrations propagate through the cleaning solution within the tank 111, creating alternating high- and low-pressure cycles. During the low-pressure cycles, millions of microscopic vacuum bubbles rapidly form. As the high-pressure cycles immediately follow, these bubbles violently collapse. The implosion generates localized scrubbing action and microscopic jets of liquid that dislodge contaminants from the artifact's surface, even in tiny crevices.

[0040] A storage container 113 housing multiple cleaning solutions is mounted on the body 101; the microcontroller selects and dispenses an appropriate cleaning solution through conduit pipes into the tank 111 based on the artifact's material type as determined by the imaging unit 103. The storage container 113 is a compartmentalized unit designed to hold various specialized cleaning agents, each optimized for different artifact materials. Each compartment is connected to a dedicated conduit pipe that leads to the ultrasonic cleaning tank 111. Under the control of the microcontroller, which has determined the artifact's material type via the imaging unit 103, the system activates a specific pump associated with the appropriate cleaning solution. Simultaneously, a corresponding valve opens, allowing that particular solution to be precisely drawn from its dedicated compartment and dispensed through its conduit pipe directly into the ultrasonic cleaning tank 111. This internal process ensures that the correct cleaning agent is automatically selected and delivered in a controlled manner, preventing cross-contamination and optimizing the cleaning process for each unique artifact.

[0041] A Peltier unit is also integrated within the tank 111, which maintains a predefined temperature suitable for the cleaning process. The Peltier unit comprises a series of alternating P-type and N-type semiconductor pellets sandwiched between two ceramic plates. When a DC electric current is applied, it causes electrons to move from a lower energy level to a higher energy level at one junction, absorbing heat from that side, making it cold. Simultaneously, at the opposite junction, electrons move from a higher to a lower energy level, releasing heat to that side, making it hot. By carefully controlling the direction and magnitude of the current, the Peltier unit actively pumps heat from one side to the other, enabling precise temperature maintenance for the cleaning solution in the ultrasonic tank 111.
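The temperature-maintenance behavior, reversing current direction to heat or cool and scaling magnitude with the deviation, can be sketched as a minimal proportional controller; the gain and current limit are illustrative assumptions, not values from the specification:

```python
def peltier_drive(measured_c, setpoint_c, gain=0.8, max_current_a=3.0):
    """Proportional control of Peltier drive current for the tank 111 bath.
    The sign encodes current direction: positive means pump heat out of
    the bath (cool), negative means reversed polarity (heat). Gain and
    current limit are assumed, illustrative values."""
    error = measured_c - setpoint_c          # > 0: bath warmer than setpoint
    current = gain * error                   # cool harder the warmer it is
    # Clamp to the module's safe drive current.
    return max(-max_current_a, min(max_current_a, current))
```

A real controller would typically add integral action and account for the Peltier module's own Joule heating; this sketch only shows the direction/magnitude logic the paragraph describes.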

[0042] An X-ray fluorescence (XRF) sensor is mounted on the body 101 for analyzing the material composition of the cleaned artifact. The XRF sensor works internally by using an X-ray tube and a detector. The X-ray tube generates primary X-rays that bombard the artifact. This excitation causes inner-shell electrons within the artifact's atoms to be ejected. When outer-shell electrons fill these vacancies, they emit unique secondary X-rays, known as fluorescent X-rays, each with a characteristic energy specific to its parent element. The detector, typically a silicon drift detector (SDD), captures these fluorescent X-rays and measures their energies and intensities. An internal processor then analyzes this spectral fingerprint to identify the elements present in the artifact and determine their relative concentrations, providing a non-destructive material composition analysis of the cleaned artifact.
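The element-identification step the internal processor performs can be sketched as matching detected peak energies against tabulated characteristic lines; the line energies below are standard published values, while the matching tolerance is an assumption:

```python
# Characteristic emission line energies in keV (standard tabulated values
# for a few elements commonly seen in archaeological materials).
LINES = {"Ca Ka": 3.69, "Fe Ka": 6.40, "Cu Ka": 8.05, "Pb La": 10.55}

def identify_elements(peak_energies_kev, tol_kev=0.15):
    """Match each detected spectral peak against known characteristic
    lines within an energy tolerance; returns the matched line labels."""
    found = []
    for e in peak_energies_kev:
        for name, ref in LINES.items():
            if abs(e - ref) <= tol_kev:
                found.append(name)
    return found
```

A production analyzer would also fit peak intensities to estimate concentrations; this sketch covers only the qualitative "which elements are present" step.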

[0043] The microcontroller also instructs the individual(s) to select an appropriate tool, based on the artifact detection and excavation requirements, from a storage chamber 114 integrated on the body 101 that houses excavation tools including trowels, brushes, and hand shovels. The storage chamber 114 is essentially a compartment for housing these excavation tools. Its internal design includes custom-fitted slots and magnetic strips to secure the tools and prevent rattling during movement.

[0044] Once the XRF sensor delivers the artifact's elemental composition, potentially alongside visual data from the imaging unit 103 conveying its condition, this information becomes input for the microcontroller's embedded machine learning protocols. These protocols are not rigid rule-sets; rather, they have undergone extensive training on vast historical datasets. The imaging unit 103 also assists soil stratigraphy analysis, and this data is used by the microcontroller to guide the individual through different soil layers and help estimate the age of discovered artifacts. The imaging unit 103 aids soil stratigraphy analysis by capturing high-resolution photographs of exposed excavation profiles. The microcontroller processes these images using image analysis protocols, potentially enhanced by machine learning models trained on diverse soil characteristics. By analyzing visual cues such as color variations, texture, and inclusions, the system internally differentiates distinct soil layers. This visual data allows the microcontroller to guide the individual through different depths, applying the principle of superposition to estimate the relative age of artifacts based on their position within the identified stratigraphic sequence. The training data is crucial, encompassing known artifacts with their verified material compositions, established ages, specific pottery typologies, metal alloys, and vital GPS-linked excavation data that contextualizes their discovery within geological strata and environmental conditions.
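The principle of superposition invoked above can be sketched as a minimal relative-dating helper; the artifact identifiers and depths are hypothetical examples, not data from the specification:

```python
def relative_age_order(artifact_depths_cm):
    """Apply the principle of superposition: in undisturbed strata, deeper
    finds were deposited earlier. Takes {artifact_id: depth_cm} and returns
    artifact IDs ordered from oldest (deepest) to youngest (shallowest)."""
    return [aid for aid, _ in sorted(artifact_depths_cm.items(),
                                     key=lambda kv: kv[1],
                                     reverse=True)]
```

This gives only relative order within one undisturbed sequence; the absolute age estimate described in the paragraph still depends on the trained models and the GPS-linked contextual data.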

[0045] A GPS (Global Positioning System) module is integrated with the microcontroller for real-time geo-location tracking, geo-tagging, and mapping of the excavation site. The GPS module works internally by receiving and processing signals from a constellation of Earth-orbiting satellites. The GPS module contains a receiver antenna to capture faint microwave radio signals continuously broadcast by at least four GPS satellites. Each signal contains precise timing information and the satellite's exact orbital position. The GPS module's internal processor then calculates the time-of-flight for each signal, determining the distance from the receiver to each satellite. Using a mathematical principle called trilateration, the processor combines these distances to pinpoint its own precise 3D location (latitude, longitude, and altitude) on the archaeological site. The real-time geo-location data is then sent to the microcontroller for geo-tagging artifacts and mapping the excavation site.
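The trilateration step can be sketched by linearizing the sphere (range) equations against the first satellite. This is a simplified illustration: it assumes exact ranges and omits the receiver clock-bias term that a real GPS solution estimates as a fourth unknown:

```python
import numpy as np

def trilaterate(sat_positions, ranges):
    """Estimate receiver position from >= 4 satellite positions and
    measured ranges. Subtracting the first sphere equation
    |x - p_i|^2 = r_i^2 from the others eliminates the quadratic |x|^2
    term, leaving the linear system 2*(p_i - p_0) . x = r_0^2 - r_i^2
    + |p_i|^2 - |p_0|^2, solved here by least squares."""
    p = np.asarray(sat_positions, dtype=float)
    r = np.asarray(ranges, dtype=float)
    A = 2.0 * (p[1:] - p[0])
    b = (r[0]**2 - r[1:]**2) + (p[1:]**2).sum(axis=1) - (p[0]**2).sum()
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

Real receivers work in an Earth-centered Cartesian frame and convert the solved point to latitude, longitude, and altitude afterwards; that conversion is omitted here.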

[0046] Lastly, a battery (not shown in the figure) is associated with the system to supply power to the electrically powered components employed herein. The battery comprises a pair of electrodes, namely a cathode and an anode. The battery uses oxidation/reduction chemical reactions to do work on charge and produce a voltage between the anode and cathode, thus producing the electrical energy used to do work in the system.

[0047] The present invention works best in the following manner. The mobile body 101, equipped with multiple omnidirectional wheels 102, is positioned on the archaeological site's surface, allowing for flexible movement in all directions. During operation, an ultrasonic sensor continuously detects the presence of individuals in proximity to the body 101. Upon successful detection of an individual, a microcontroller linked with the ultrasonic sensor activates the artificial intelligence-based imaging unit 103, integrated with a facial recognition protocol. Concurrently, the ground-penetrating radar (GPR) scans the surface of the site for potential artifacts. The GPR generates a heat map, which is displayed on a touchscreen display 104, providing depth and location data of detected artifacts to the individual(s). As the excavation proceeds, the 3D holographic projection unit 105 provides detailed visual guides for excavation techniques, including proper tool selection and artifact dating based on soil depth. For practical guidance during excavation, a pair of gloves 106, associated with the system, are adapted to be worn by the individual(s). These gloves 106 are equipped with pressure sensors and haptic feedback units, providing real-time feedback on the excavation force applied and guiding the trainee in correct excavation methods. Furthermore, a GPS (Global Positioning System) module provides real-time geo-location tracking, geo-tagging, and mapping of the excavation site. To facilitate tool selection, the storage chamber 114 houses excavation tools including trowels, brushes, and hand shovels, wherein the microcontroller instructs the individual(s) to select the appropriate tool based on the artifact detection and excavation requirements.

[0048] In continuation, upon recovery of an artifact, it is placed onto an artifact scanning and identification module. This module comprises a rotating platform 107 and a motorized dual-axis slider 108, and is integrated with a LiDAR (Light Detection and Ranging) sensor and synced with the imaging unit 103, for 360-degree scanning and classification of recovered artifacts based on material composition and historical data. An AI-based classification component within the artifact scanning and identification module analyzes 3D images captured by the imaging unit 103 and classifies recovered artifacts according to material type, such as metal, bone, or ceramic, providing initial restoration suggestions based on this classification. Within the rotating platform 107, a robotic arm 109 with a soft bristle brush 110 is arranged. The microcontroller actuates this robotic arm 109 to gently clean the artifact, removing loose dirt and debris. Following the initial cleaning, an ultrasonic cleaning tank 111, arranged adjacent to the rotating platform 107, is utilized. This tank 111 comprises a robotic gripper 112 with a rubberized end for transferring the artifact from the platform 107 into the tank 111. The Peltier unit is positioned within the tank 111 for maintaining a predefined temperature suitable for the cleaning process. The storage container 113 houses multiple cleaning solutions; the microcontroller selects and dispenses the appropriate cleaning solution from the container 113 through conduit pipes into the tank 111 based on the artifact’s material type as determined by the imaging unit 103. After cleaning, the X-ray fluorescence (XRF) sensor analyzes the material composition of the cleaned artifact. The microcontroller further employs machine learning protocols and historical GPS-linked excavation data to estimate artifact age and classify the artifact. Based on this classification and the artifact's condition, the microcontroller provides detailed restoration suggestions, which are displayed on the touchscreen display 104 for guidance to the individual(s). The imaging unit 103 also assists in soil stratigraphy analysis, and this data is used by the microcontroller to guide the individual through different soil layers and help estimate the age of discovered artifacts.
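The end-to-end operational sequence described in paragraphs [0047] and [0048] can be summarized as an ordered pipeline; the step names below are illustrative labels, not terms from the specification:

```python
# Illustrative workflow order only; labels are hypothetical shorthand.
PIPELINE = [
    "detect_individual",        # ultrasonic sensor
    "facial_recognition",       # imaging unit 103
    "gpr_scan",                 # heat map on the display 104
    "guided_excavation",        # holographic unit 105 + gloves 106
    "scan_and_classify",        # platform 107, slider 108, LiDAR
    "brush_clean",              # robotic arm 109, brush 110
    "ultrasonic_clean",         # tank 111, gripper 112, solutions 113
    "xrf_analysis",             # XRF sensor
    "age_estimation",           # ML protocols + GPS-linked data
    "restoration_suggestions",  # touchscreen display 104
]

def next_step(done):
    """Return the step that follows the completed one, or None at the end."""
    i = PIPELINE.index(done)
    return PIPELINE[i + 1] if i + 1 < len(PIPELINE) else None
```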

[0049] Although the field of the invention has been described herein with limited reference to specific embodiments, this description is not meant to be construed in a limiting sense. Various modifications of the disclosed embodiments, as well as alternate embodiments of the invention, will become apparent to persons skilled in the art upon reference to the description of the invention.

Claims:

1) An archaeological excavation system for artifact scanning and restoration guidance, comprising:

i) a mobile body 101 structured to be positioned on a surface of an archaeological site, wherein multiple omnidirectional wheels 102 are arranged underneath said body 101 to provide flexible movement in all directions;
ii) an ultrasonic sensor installed on the body 101 to detect presence of individual in proximity to the body, wherein upon successful detection a microcontroller linked with the ultrasonic sensor activates an artificial intelligence-based imaging unit 103 installed on the body 101 and integrated with a facial recognition protocol;
iii) a ground-penetrating radar (GPR) mounted on said body 101 to scan the surface of the site for potential artifacts, wherein said GPR generates a heat map displayed on a touchscreen display 104 installed on the body 101, providing depth and location data of detected artifacts to the individual(s);
iv) a 3D holographic projection unit 105 positioned on said body 101 to provide detailed visual guides for excavation techniques, including proper tool selection and artifact dating based on soil depth;
v) a pair of gloves 106 associated with the system and adapted to be worn by the individual(s), wherein said gloves 106 are equipped with pressure sensors and haptic feedback units, providing real-time feedback on the excavation force applied and guiding the trainee in correct excavation methods;
vi) an artifact scanning and identification module, comprising a rotating platform 107 and a motorized dual-axis slider 108, integrated with a LiDAR (Light Detection and Ranging) sensor and synced with the imaging unit 103, for 360-degree scanning and classification of recovered artifacts based on material composition and historical data;
vii) a robotic arm 109 with a soft bristle brush 110 is arranged within said platform 107, said microcontroller actuates said arm 109 to gently clean said artifact to remove loose dirt and debris, wherein an ultrasonic cleaning tank 111 arranged adjacent to said platform 107, said tank 111 comprising a robotic gripper 112 with a rubberized end for transferring said artifact from said platform 107 into said tank 111, and a Peltier unit positioned within said tank 111 for maintaining a predefined temperature suitable for said cleaning process; and
viii) an X-ray fluorescence (XRF) sensor mounted on said body 101 for analyzing material composition of said cleaned artifact, said microcontroller further employs machine learning protocols and historical GPS-linked excavation data to estimate artifact age and classify said artifact, wherein based on said classification and artifact condition, said microcontroller provides restoration suggestions, that are further displayed on said touchscreen display 104 for guidance.

2) The system as claimed in claim 1, wherein a GPS (Global Positioning System) module is integrated with said microcontroller for real-time geo-location tracking, geo-tagging, and mapping of excavation site.

3) The system as claimed in claim 1, wherein a storage chamber 114 is integrated within said body 101 to house excavation tools including trowels, brushes, hand shovels, and more, said microcontroller instructs individual(s) to select the appropriate tool based on the artifact detection and excavation requirements.

4) The system as claimed in claim 1, wherein said artifact scanning and identification module further includes an AI-based classification component that analyzes 3D images captured by the imaging unit 103 and classifies recovered artifacts according to material type, such as metal, bone, or ceramic, providing restoration suggestions based on said classification.

5) The system as claimed in claim 1, wherein said imaging unit 103 assists in soil stratigraphy analysis, and said data is used by said microcontroller to guide the individual through different soil layers and help estimate the age of discovered artifacts.

6) The system as claimed in claim 1, wherein a storage container 113 housing multiple cleaning solutions is mounted on the body, said microcontroller selects and dispenses an appropriate cleaning solution through conduit pipes into said tank 111 based on said artifact’s material type as determined by said imaging unit 103.

7) The system as claimed in claim 1, wherein a battery is associated with said system for supplying power to electrical and electronically operated components associated with said system.

Documents

Application Documents

# Name Date
1 202521052756-STATEMENT OF UNDERTAKING (FORM 3) [30-05-2025(online)].pdf 2025-05-30
2 202521052756-REQUEST FOR EXAMINATION (FORM-18) [30-05-2025(online)].pdf 2025-05-30
3 202521052756-REQUEST FOR EARLY PUBLICATION(FORM-9) [30-05-2025(online)].pdf 2025-05-30
4 202521052756-PROOF OF RIGHT [30-05-2025(online)].pdf 2025-05-30
5 202521052756-POWER OF AUTHORITY [30-05-2025(online)].pdf 2025-05-30
6 202521052756-FORM-9 [30-05-2025(online)].pdf 2025-05-30
7 202521052756-FORM FOR SMALL ENTITY(FORM-28) [30-05-2025(online)].pdf 2025-05-30
8 202521052756-FORM 18 [30-05-2025(online)].pdf 2025-05-30
9 202521052756-FORM 1 [30-05-2025(online)].pdf 2025-05-30
10 202521052756-FIGURE OF ABSTRACT [30-05-2025(online)].pdf 2025-05-30
11 202521052756-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [30-05-2025(online)].pdf 2025-05-30
12 202521052756-EVIDENCE FOR REGISTRATION UNDER SSI [30-05-2025(online)].pdf 2025-05-30
13 202521052756-EDUCATIONAL INSTITUTION(S) [30-05-2025(online)].pdf 2025-05-30
14 202521052756-DRAWINGS [30-05-2025(online)].pdf 2025-05-30
15 202521052756-DECLARATION OF INVENTORSHIP (FORM 5) [30-05-2025(online)].pdf 2025-05-30
16 202521052756-COMPLETE SPECIFICATION [30-05-2025(online)].pdf 2025-05-30
17 Abstract.jpg 2025-06-18
18 202521052756-FORM-26 [01-07-2025(online)].pdf 2025-07-01