
A Method And A System For Analysing Sports Data

Abstract: The disclosure provides a method and a system for analysing sports data. The method comprises collecting sports data by using one or more sensors. The method may include filtering the collected sports data. The method further includes identifying a type of shot from the collected sports data, wherein the type of shot comprises at least a vertical, horizontal, defensive, or attacking shot. Further, the method also includes performing classification of the shot, wherein the classification comprises at least a drive, cut, pull, hook, sweep, reverse sweep, or paddle sweep.


Patent Information

Application #:
Filing Date: 06 March 2023
Publication Number: 11/2023
Publication Type: INA
Invention Field: COMPUTER SCIENCE
Status:
Email: ramak@kreativenerzie.com
Parent Application:

Applicants

QUICK LOGI TECHNOLOGIES INDIA PRIVATE LIMITED
D-2 West, Trinity Acres Sarjapur Road Karnataka Bangalore INDIA, Pin 560035

Inventors

1. Arminderpal Singh Thind
Villa no. 10, Saiven Caesars Palace, Burudhukunte Road, Volagerekallahalli, Sarjapur Road, Karnataka Bangalore 562125 (IN)
2. Ishwinderpal Singh Thind
Villa no. 10, Saiven Caesars Palace, Burudhukunte Road, Volagerekallahalli, Sarjapur Road, Karnataka Bangalore 562125 (IN)
3. Mahesha Godekere Siddalingaiah
Villa no. 10, Saiven Caesars Palace, Burudhukunte Road, Volagerekallahalli, Sarjapur Road, Karnataka Bangalore 562125 (IN)

Specification

Description: TECHNOLOGICAL FIELD
[0001] The present disclosure generally relates to sport accessories, and more particularly relates to a system and a method for analysing sports data.
BACKGROUND
[0002] Cricket is a popular sport that originated in England and is now played in many countries around the world. It is a bat-and-ball game played between two teams of 11 players each. The objective of the game is to score more runs than the opposing team while also taking all 10 of the opposing team's wickets.
[0003] At the beginning of the game, a coin toss is held to decide which team will bat first. The team that wins the toss can either choose to bat first or bowl first. The team that bats first sends two batsmen out to the field to face the first ball, while the opposing team sends out their bowler to bowl the first over.
[0004] The batting team scores runs by hitting the ball and running back and forth between two sets of wickets, which are located at opposite ends of the rectangular playing field, known as the pitch. If the ball is hit to the boundary of the field, the batting team scores four runs, and if it is hit over the boundary without touching the ground, the team scores six runs.
[0005] The fielding team tries to prevent the batting team from scoring runs by fielding the ball and trying to get the batsmen out. The most common way to get a batsman out is by hitting the wickets with the ball, known as a "bowled" dismissal. Other ways include catching the ball after the batsman hits it, "run out" when a fielder throws the ball and hits the wickets while a batsman is attempting to run, and more.
[0006] Once a team loses all 10 wickets or reaches the end of their allotted overs, the teams switch roles, with the fielding team becoming the batting team and vice versa. The team with the most runs at the end of the game wins. A typical game of cricket can last anywhere from a few hours to several days, depending on the format of the game.
[0007] Cricket has seen the adoption of various technologies to improve the accuracy of decision-making by umpires and enhance the viewing experience of fans. Some of the technologies used in cricket include:
[0008] Decision Review System (DRS): This system uses ball-tracking and edge-detection technology to help umpires make more accurate decisions regarding dismissals, such as lbw (leg before wicket) and caught behind. Each team is allowed to challenge a limited number of decisions per innings.
[0009] Hawk-Eye: This technology tracks the trajectory of the ball and predicts its path, which helps umpires make decisions on lbw appeals and gives fans a better view of close decisions.
[0010] Snickometer: This technology uses audio sensors and cameras to detect faint edges off the bat, which helps umpires make decisions on caught-behind appeals.
[0011] Hotspot: This technology uses infrared cameras to detect whether the ball has made contact with the bat or pad, which helps umpires make decisions on lbw appeals and caught-behind appeals.
[0012] LED stumps and bails: These are electronic versions of the traditional wooden stumps and bails. They light up when the ball hits them, which helps umpires make decisions on run-outs and stumpings.
[0013] Overall, the use of technology in cricket has improved the accuracy of decision-making and added to the excitement of the game.
[0014] Batting is one of the two main disciplines in the game of cricket, the other being bowling. Batting is the act of hitting the ball with a cricket bat in order to score runs for the batting team.
[0015] In cricket, there are two batsmen on the field at any given time, each taking turns to face the bowler. The batsman's primary objective is to score runs, while also trying to avoid getting out. The batsman scores runs by hitting the ball with the bat and running between the two ends of the pitch. A run is completed each time the batsmen successfully cross each other's end of the pitch.
[0016] Batting involves a combination of skill, technique, and strategy. A good batsman must have quick reflexes, excellent hand-eye coordination, and the ability to judge the trajectory and pace of the ball. Batsmen also use a variety of techniques, such as footwork and shot selection, to score runs and avoid getting out.
[0017] Some common shots in cricket include the defensive shot, the drive, the pull, and the cut. Batsmen also need to be able to read the bowler's delivery and adjust their shot accordingly. The skill of batting is highly valued in cricket, and many of the greatest cricketers of all time are known for their ability with the bat.
[0018] There are several technologies that are used to help cricketers improve their batting skills. Some of these technologies include:
[0019] Bowling machines: These machines simulate the delivery of a cricket ball at different speeds and angles, allowing batsmen to practice their shots without the need for a bowler. Bowling machines are often used to help batsmen develop their technique and improve their reflexes.
[0020] Video analysis: Video technology is used to record a batsman's innings, which can then be analyzed by coaches and the player themselves to identify areas for improvement. Video analysis can also help players understand their strengths and weaknesses and make adjustments to their technique.
[0021] Performance tracking sensors: Wearable sensors can be used to track a batsman's performance, including the speed of their bat swing, the angle of their bat, and the amount of force they apply to the ball. This data can be used to identify areas for improvement and monitor progress over time.
[0022] Virtual reality: Virtual reality technology is used to simulate match situations, allowing batsmen to practice their shots against different types of bowling in a safe and controlled environment. This technology can help batsmen develop their decision-making skills and improve their ability to read the ball.
[0023] Some sports equipment manufacturers have developed "smart" bats that include sensors to measure the quality of each play. This data can be used to help sports persons identify areas for improvement and make adjustments to their technique. However, similar technology is seldom available in cricket.
[0024] Overall, these technologies are used to help cricketers improve their batting skills and achieve better performance on the field.
[0025] Sports such as cricket are professional games. Players participating in such games must remain fit and deliver consistent performance to stay in the game. To improve their performance, it is first important for players to understand their own performance, including their strengths and weaknesses, during each game. There are existing solutions in the market which utilize motion sensor/s to sense actions such as batting or bowling deliveries by a player. However, the existing technologies, which mostly work in controlled laboratory conditions, use bulky equipment and are expensive. Moreover, these existing solutions are not always able to accurately detect each action of the player, resulting in false positives and false negatives. Due to the false positives and false negatives, the players do not get accurate data about their performance. In the absence of such accurate data about their performance, the players do not know or understand their performance well and are thus also unable to curate a better improvement plan for their performance.
[0026] Therefore, there is a need for a system and a method for automatically analysing sports data of a player. There is also a need for a system and a method for determining performance and improvement data of the player based on the analysis of the sports data.
BRIEF SUMMARY
[0027] Accordingly, there is a need for automatically analysing sports data of a player and determining performance as well as improvement data of the player based on the analysis of the sports data. In order to analyse the sports data of the player, it is important to use one or more sensors to sense data while the player is playing and to analyse the sensed data for curating a performance plan for the player.
[0028] Some example embodiments disclosed herein provide a system for analysing sports data. The system comprises one or more sensors to collect sports data and a filtering module to filter the collected sports data. The system also comprises an identification module to identify a type of shot from the collected sports data, wherein the type of shot comprises at least a vertical, horizontal, defensive, or attacking shot. The system further comprises a classification module to perform classification of the shot, wherein the classification comprises at least a drive, cut, pull, hook, sweep, reverse sweep, or paddle sweep.
[0029] According to some example embodiments, the one or more sensors comprise at least one of accelerometers, gyroscopes, magnetometers, piezoelectric sensors, electromagnetic trackers, flex, pressure, and optical sensors. In addition, GPS and LIDAR systems may also be employed.
[0030] According to some example embodiments, the one or more sensors are attached to a wearable device, wherein the wearable device comprises at least one of a smart watch, a smart band, and smart clothing. In another embodiment, the one or more sensors are attached to or embedded in sports equipment.
[0031] According to some example embodiments, signal processing and time-series analysis are used to filter the collected sports data.
[0032] According to some example embodiments, the classification further comprises using at least a deep learning model.
[0033] According to some example embodiments, the deep learning model comprises at least one of a recurrent neural network using at least a gated recurrent unit, an attention-based long short-term memory, and a convolutional neural network.
[0034] According to some example embodiments, the system further comprises an action detection module to detect actions from the collected sports data and discard a wrong action. According to some other example embodiments, the action detection module detects sports actions of interest at a high accuracy rate from the collected sports data and discards sports actions which are not of interest.
[0035] According to some example embodiments, the actions not of interest comprise at least one of tapping of the bat on the ground before playing a shot and swinging of the bat while running between wickets.
[0036] According to some example embodiments, an action is detected from the collected data using the optical sensors. Also, the action detection comprises at least one of posture detection of a player, motion of at least a sports equipment, three-dimensional co-ordinates of a body part of the player derived from the posture, eye focus of the player, and hand-eye coordination of the player.
[0037] Some example embodiments disclosed herein provide a method for analysing sports data. The method comprises the steps of collecting sports data by using one or more sensors and filtering the collected sports data. The method also comprises the step of identifying a type of shot from the collected sports data, wherein the type of shot comprises at least a vertical, horizontal, defensive, or attacking shot. The method further comprises the step of performing classification of the shot, wherein the classification comprises at least a drive, cut, pull, hook, sweep, reverse sweep, paddle sweep, or stop.
[0038] The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
[0039] Having thus described example embodiments of the invention in general terms, reference will now be made to the accompanying drawings, which are not necessarily drawn to scale, and wherein:
[0040] FIGS. 1A-1F illustrate a system architecture for analysing sports data, in accordance with different embodiments;
[0041] FIG. 2 illustrates an exemplary communication between a wearable device having an electronic circuitry attached or embedded thereto and a computing device, in accordance with an example embodiment;
[0042] FIG. 3 illustrates a block diagram of an electronic circuitry for analysing sports data, in accordance with an example embodiment;
[0043] FIG. 4 illustrates a block diagram for building and training an artificial intelligence (AI) model for performance analysis of a player, in accordance with an example embodiment;
[0044] FIG. 5 illustrates a block diagram of deep learning algorithms utilized for building and training the AI model, in accordance with an example embodiment;
[0045] FIG. 6 illustrates a flow diagram of a method for analysing sports data, in accordance with an example embodiment.
DETAILED DESCRIPTION
[0046] In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. It will be apparent, however, to one skilled in the art that the present disclosure can be practiced without these specific details. In other instances, systems, apparatuses, and methods are shown in block diagram form only in order to avoid obscuring the present disclosure.
[0047] Reference in this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. The appearance of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Further, the terms “a” and “an” herein do not denote a limitation of quantity, but rather denote the presence of at least one of the referenced items. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not for other embodiments.
[0048] Some embodiments of the present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all, embodiments of the invention are shown. Indeed, various embodiments of the invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will satisfy applicable legal requirements. Like reference numerals refer to like elements throughout. As used herein, the terms “data,” “content,” “information,” and similar terms may be used interchangeably to refer to data capable of being transmitted, received and/or stored in accordance with embodiments of the present invention. Thus, use of any such terms should not be taken to limit the spirit and scope of embodiments of the present invention.
[0049] The embodiments are described herein for illustrative purposes and are subject to many variations. It is understood that various omissions and substitutions of equivalents are contemplated as circumstances may suggest or render expedient but are intended to cover the application or implementation without departing from the spirit or the scope of the present disclosure. Further, it is to be understood that the phraseology and terminology employed herein are for the purpose of the description and should not be regarded as limiting. Any heading utilized within this description is for convenience only and has no legal or limiting effect.
[0050] Embodiments of the present disclosure may provide a system and a method for analysing sports data. The system and the method for analysing sports data in such an improved manner are described with reference to FIG. 1A to FIG. 6 as detailed below.
[0051] FIG. 1A illustrates a system architecture 100A for analysing sports data, in accordance with an example embodiment. A smart cricket bat 102 of a player 114 is shown. As can be seen, the smart cricket bat 102 may include a handle unit connected to a blade unit. Also, an electronic circuitry 104 may be removably attached or embedded/integrated with the handle unit of the smart cricket bat 102.
[0052] The electronic circuitry 104 may be capable of collecting and processing sports data of the player 114. Such sports data may include events or activities such as running, batting, vocal/audio communication, and/or any such activity of the player 114. The electronic circuitry 104 may be capable of processing the collected sports data to filter out the noise or irrelevant data and then process the sports data for performance analysis. The collecting and processing of sports data for performance analysis of the player 114 by the electronic circuitry 104 is explained in greater detail in FIG. 2 to FIG. 6 below.
[0053] Moreover, the electronic circuitry 104 may also be in communication with a computing device 112 through a network 116. In this scenario, the electronic circuitry 104 may communicate or transmit the performance analysis and/or filtered sensor data of the player 114 to the computing device 112 for display or real-time updates and/or further processing to get performance analytics and deeper player performance insights. Such computing device 112 may belong to the player 114 or may belong to a cricket board/committee or may belong to a cricket team for which the player 114 is playing.
[0054] Further, the electronic circuitry 104 may be in communication with a server 108 through a network 106. Herein, the electronic circuitry 104 may communicate with the server 108 for transmitting the collected sports data to the server 108. In this exemplary embodiment, instead of the electronic circuitry 104, the server 108 processes the collected sports data to determine performance analysis of the player 114. In a preferred embodiment, some of the data processing is carried out at a mobile device.
[0055] Furthermore, the server 108 may be in communication with the computing device 112 through a network 110. The server 108 may process the collected data to determine the performance analysis of the player 114. Then, the server 108 may also be capable of transmitting the performance analysis of the player 114 to the computing device 112.
[0056] As used herein, the term “network” may refer to a long-range cellular network (such as a GSM (Global System for Mobile Communication) network, an LTE (Long-Term Evolution) network, or a CDMA (Code Division Multiple Access) network) or a short-range network (such as a Bluetooth network, Wi-Fi network, NFC (near-field communication) network, LoRaWAN, ZigBee, or wired networks such as LAN, etc.).
[0057] As used herein, the term “computing device” may refer to a mobile phone, a personal digital assistant (PDA), a tablet, a laptop, a computer, a VR headset, smart glasses, or any such device capable of rendering performance analysis of the player 114.
[0058] As used herein, the term ‘electronic circuitry’ (as explained in FIG.2 below) may refer to (a) hardware-only circuit implementations (for example, implementations in analog circuitry and/or digital circuitry); (b) combinations of circuits and computer program product(s) comprising software and/or firmware instructions stored on one or more computer readable memories that work together to cause an apparatus to perform one or more functions described herein; and (c) circuits, such as, for example, a microprocessor(s) or a portion of a microprocessor(s), that require software or firmware for operation even if the software or firmware is not physically present. This definition of ‘circuitry’ applies to all uses of this term herein, including in any claims. As a further example, as used herein, the term ‘circuitry’ also includes an implementation comprising one or more processors and/or portion(s) thereof and accompanying software and/or firmware. As another example, the term ‘circuitry’ as used herein also includes, for example, a baseband integrated circuit or applications processor integrated circuit for a mobile phone or a similar integrated circuit in a server, a cellular network device, other network device, and/or other computing device.
[0059] FIG. 1B illustrates a system architecture 100B for analysing sports data, in accordance with an example embodiment. In this exemplary embodiment, the electronic circuitry 104 may be attached or embedded with clothing of the player 114. For an example, the electronic circuitry 104 may be attached or embedded to a t-shirt (i.e., upper half clothing) worn by the player 114.
[0060] FIG. 1C illustrates a system architecture 100C for analysing sports data, in accordance with an example embodiment. In this exemplary embodiment, the electronic circuitry 104 may be attached or embedded with the clothing of the player 114 at a different position. For an example, the electronic circuitry 104 may be attached or embedded to a pair of trousers or pants (i.e. lower half clothing) worn by the player 114.
[0061] FIG. 1D illustrates a system architecture 100D for analysing sports data, in accordance with an example embodiment. In this exemplary embodiment, the electronic circuitry 104 may be attached or embedded with a helmet worn by the player 114.
[0062] FIG. 1E illustrates a system architecture 100E for analysing sports data, in accordance with an example embodiment. As an example, the electronic circuitry 104 may be directly attached to (i.e. worn by) the player 114 on his/her wrist or hand. In another example, the electronic circuitry 104 may be directly attached to (i.e. worn by) the player 114 on his/her leg or thigh.
[0063] FIG. 1F illustrates a system architecture 100F for analysing sports data, in accordance with an example embodiment. In this exemplary embodiment, one or more imaging units (such as cameras) 120A-120B may be installed in a cricket field for capturing videos and images of the player 114. A person skilled in the art would not limit the camera units to be only on the field; the camera units may also be a part of the mobile device 112. Such captured videos and images of the player 114 may be transmitted to the electronic circuitry 104 or to the server 108 for detecting the posture of the player 114, motion of at least a sports equipment (such as the bat 102), three-dimensional co-ordinates of a body part of the player 114 from the posture, eye focus of the player 114, and hand-eye coordination of the player 114.
[0064] In some embodiments, the one or more imaging units (such as cameras) 120A-120B are installed or positioned in the stumps of the wickets. In some other embodiments, the one or more imaging units (such as cameras) 120A-120B are installed or positioned at different positions of the cricket field. Although FIG. 1F shows only two imaging units, using more than two imaging units or fewer than two imaging units is also within the scope of the present disclosure.
[0065] FIG. 2 illustrates an exemplary communication between a wearable device 118 having an electronic circuitry 104 attached or embedded thereto and a computing device 112, in accordance with an example embodiment. As shown in FIG. 2, the wearable device 118 may have the electronic circuitry 104. In an exemplary embodiment, the electronic circuitry 104 is removably attached with the wearable device 118. In another exemplary embodiment, the electronic circuitry 104 is embedded or integrated into the wearable device 118. Further, the wearable device 118 may be communicably coupled with the computing device 112 via a short-range network such as Bluetooth or NFC (near-field communication). This wearable device 118 may be worn by the player 114 either at his wrist or hand or thigh or leg while playing. In some embodiments, the wearable device 118 may also process the collected sports data for performance analysis of the player 114 and communicate the performance analysis of the player 114 directly to the computing device 112 via the short-range network.
[0066] Moreover, the wearable device 118 may include a camera, a microphone, LiDAR (Light Detection and Ranging) sensors and any such module or component well known in the art. As used herein, the wearable device 118 may refer to a smart watch, a smart band, a smart clothing or any such wearable device that is obvious to a person skilled in the art.
[0067] FIG. 3 illustrates a block diagram of an electronic circuitry 104 for analysing sports data, in accordance with an example embodiment. The electronic circuitry 104 may comprise, but is not limited to, an interface 302, a receiver 304, a transmitter 306, one or more sensors 308, an identification module 310, a classification module 312, a filtering module 314, a processor 316, an action detection module 318 and a memory 320.
[0068] The interface 302 may be configured to receive an input from the player 114 or any other user. The interface 302 may be configured to output the performance analysis of the player 114.
[0069] The receiver 304 may be configured to receive any kind of information and/or data from the computing device 112 and/or the server 108. Such information and/or data may include notifications, performance analysis etc.
[0070] The transmitter 306 may be configured to transmit any kind of information and/or data to the computing device 112 and/or the server 108. Such information and/or data may include performance analysis etc.
[0071] The one or more sensors 308 may be configured to collect sports data of the player 114. As explained in FIG. 1 above, the sports data may include any activity or event related to the player 114, such as running, batting, bowling, wicket keeping, fielding, vocal/audio communication, breathing, heart rate, and/or any such activity of the player 114. The one or more sensors 308 may comprise at least one of accelerometers, gyroscopes, magnetometers, piezoelectric sensors, electromagnetic trackers, a heart-rate sensor, a microphone, LiDAR (Light Detection and Ranging), GPS, a flex sensor, and any such sensor that is obvious to a person skilled in the art. For an example, the accelerometer may sense the speed at which the player 114 is running between wickets, the step count, or the bat speed while batting. Further, the bat speed may be a linear or angular speed.
[0072] The one or more sensors 308 may be communicably coupled with the filtering module 314 to communicate the collected sports data of the player 114. The filtering module 314 may be configured to filter the collected sports data. Signal processing and time-series analysis are used to filter the collected sports data. That is, time-series techniques such as signal processing, time-series analysis, and statistical modelling are used to extract relevant features from the collected sports data that describe the motion pattern and frequency components. Also, time-series analysis techniques such as trend analysis, seasonal decomposition, and autocorrelation are used to extract features from the collected sports data.
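By way of a non-limiting illustration, a minimal Python sketch of such time-series feature extraction (mean, standard deviation, linear trend, and autocorrelation) over a one-dimensional sensor stream may look as follows; the function name, lag count, and synthetic example data are assumptions for the example only and are not part of the disclosed implementation.

import numpy as np

def time_series_features(signal: np.ndarray, max_lag: int = 5) -> dict:
    """Extract simple time-series features (trend and autocorrelation)
    from a 1-D sensor stream, e.g. one accelerometer axis."""
    t = np.arange(len(signal))
    # Linear trend: slope of a least-squares line fitted to the samples.
    slope, _intercept = np.polyfit(t, signal, deg=1)

    # Autocorrelation at lags 1..max_lag (normalized by total variance).
    centred = signal - signal.mean()
    denom = np.sum(centred ** 2)
    autocorr = [np.sum(centred[lag:] * centred[:-lag]) / denom
                for lag in range(1, max_lag + 1)]

    return {
        "mean": float(signal.mean()),
        "std": float(signal.std()),
        "trend_slope": float(slope),
        **{f"autocorr_lag{lag}": float(a) for lag, a in enumerate(autocorr, start=1)},
    }

# Example: features for a noisy oscillating swing-like signal.
t = np.linspace(0, 2, 200)
swing = np.sin(2 * np.pi * 3 * t) + 0.1 * np.random.randn(t.size)
print(time_series_features(swing))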
[0073] There are various signal processing techniques that can be used to process motion sensor data. Some of the commonly used techniques include:
[0074] Filtering: Filtering is a technique used to remove unwanted noise or artifacts from sensor data. A popular type of filter used in motion sensor data processing is the Kalman filter, which can be used to estimate the true value of a signal by combining noisy measurements with a dynamic model.
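By way of illustration only, a minimal one-dimensional Kalman filter of the kind referred to above may be sketched in Python as follows; the process and measurement variances are assumed placeholder values, not disclosed parameters.

import numpy as np

def kalman_filter_1d(measurements, process_var=1e-3, meas_var=0.5):
    """Minimal 1-D Kalman filter that tracks a slowly varying value
    (e.g. a smoothed sensor reading) from noisy scalar measurements."""
    x = measurements[0]   # initial state estimate
    p = 1.0               # initial estimate uncertainty
    estimates = []
    for z in measurements:
        # Predict: the state is assumed roughly constant, uncertainty grows.
        p = p + process_var
        # Update: blend prediction and measurement using the Kalman gain.
        k = p / (p + meas_var)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return np.asarray(estimates)

# Example: smooth a noisy, roughly constant signal.
noisy = 10.0 + np.random.randn(100) * 0.8
smoothed = kalman_filter_1d(noisy)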
[0075] Feature extraction: Feature extraction involves identifying relevant features or patterns in the sensor data. In motion sensor data processing, this might involve extracting features related to the frequency, amplitude, or direction of movement.
[0076] Time-frequency analysis: Time-frequency analysis techniques, such as the wavelet transform and Fourier transforms, can be used to analyze the spectral content of motion sensor data over time. This can be useful for identifying patterns or changes in movement over time.
[0077] Fourier transform is a mathematical technique that is commonly used to extract features from sensor data. The Fourier transform converts a time-domain signal into its frequency-domain representation, which can be useful for identifying patterns or features in the signal that may not be easily visible in the time-domain.
[0078] To extract features from sensor data using Fourier transform, the following steps are typically followed:
[0079] Pre-processing: The sensor data is typically pre-processed to remove noise, artifacts, or other unwanted signals that may interfere with the analysis.
[0080] Windowing: The sensor data is segmented into smaller sections, or windows, which are then analyzed separately. This helps to avoid issues related to spectral leakage or edge effects.
[0081] Fourier transform: The Fourier transform is applied to each window of sensor data, which results in a frequency-domain representation of the signal. This representation can be visualized using a power spectrum, which shows the distribution of power across different frequencies.
[0082] Feature extraction: Features can be extracted from the power spectrum to identify patterns or characteristics in the signal. For example, the frequency of the peak in the power spectrum may be used as a feature to identify a specific vibration frequency in the signal.
[0083] Classification: The extracted features can be used to classify the sensor data into different categories or classes.
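By way of a non-limiting illustration, the windowed Fourier workflow described in the steps above may be sketched in Python as follows; the sampling rate, window length, and chosen features are assumptions for the example only.

import numpy as np

def fft_features(signal, fs=100.0, window_size=128, step=64):
    """Windowed Fourier features: for each window, compute the power
    spectrum and keep the dominant frequency and total power."""
    features = []
    window = np.hanning(window_size)  # tapering reduces spectral leakage
    for start in range(0, len(signal) - window_size + 1, step):
        segment = signal[start:start + window_size] * window
        spectrum = np.fft.rfft(segment)
        power = np.abs(spectrum) ** 2
        freqs = np.fft.rfftfreq(window_size, d=1.0 / fs)
        features.append({
            "dominant_freq_hz": float(freqs[np.argmax(power[1:]) + 1]),  # skip DC bin
            "total_power": float(power.sum()),
        })
    return features

# Example: a 3 Hz swing-like oscillation sampled at 100 Hz.
t = np.arange(0, 4, 1 / 100.0)
sig = np.sin(2 * np.pi * 3 * t) + 0.2 * np.random.randn(t.size)
print(fft_features(sig)[0])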
[0084] Overall, Fourier transform is a powerful technique for analyzing sensor data and extracting features that can be used for a wide range of applications in fields such as engineering, healthcare, and environmental monitoring. Wavelet transform is a mathematical technique that is commonly used to extract features from sensor data. Unlike Fourier transform, which only provides frequency-domain information, wavelet transform provides both time-domain and frequency-domain information. This makes it a useful technique for analyzing non-stationary signals that vary in frequency and amplitude over time.
[0085] To extract features from sensor data using wavelet transform, the following steps are typically followed:
[0086] Pre-processing: The sensor data is typically pre-processed to remove noise, artifacts, or other unwanted signals that may interfere with the analysis.
[0087] Wavelet transform: The wavelet transform is applied to the sensor data, which results in a time-frequency representation of the signal. This representation can be visualized using a spectrogram, which shows the distribution of power across different frequencies and time.
[0088] Feature extraction: Features can be extracted from the spectrogram to identify patterns or characteristics in the signal. For example, the location and amplitude of specific wavelet coefficients may be used as features to identify a specific pattern in the signal.
[0089] Classification: The extracted features can be used to classify the sensor data into different categories or classes. For example, the presence of a specific pattern in the signal may be used to classify the health of a machine component.
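By way of illustration only, the wavelet workflow described in the steps above may be sketched in Python as follows, assuming the PyWavelets library is available; the wavelet family, decomposition level, and energy features are illustrative choices rather than disclosed values.

import numpy as np
import pywt  # PyWavelets, assumed to be installed

def wavelet_features(signal, wavelet="db4", level=3):
    """Discrete wavelet decomposition of a 1-D sensor signal; the energy of
    the coefficients at each level is used as a simple feature vector."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # coeffs[0] holds the approximation, coeffs[1:] the detail coefficients.
    return np.array([float(np.sum(c ** 2)) for c in coeffs])

# Example: per-level energies for a noisy oscillating signal.
t = np.linspace(0, 2, 256)
sig = np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.randn(t.size)
print(wavelet_features(sig))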
[0090] In summary, wavelet transform is a powerful technique for analyzing sensor data and extracting features that can be used for a wide range of applications in fields such as engineering, healthcare, and environmental monitoring. It is particularly useful for analyzing non-stationary signals that may not be easily analyzed using Fourier transform.
[0091] Overall, signal processing techniques are used to extract meaningful information from motion sensor data, which can be used for a wide range of applications in fields such as sports, healthcare, and robotics.
[0092] Signal processing techniques, such as filtering, Fourier transforms, or wavelet transforms, are used to extract features from the collected sports data. These techniques are used to extract features such as frequency, amplitude, phase, and energy from the collected sports data.
[0093] Such filtering may include removal of the noise from the collected sports data while retaining the relevant sports data. Noise, i.e. data that is redundant or not relevant for further processing, is discarded at this stage. An example of such irrelevant sports data may be data collected in the time interval after the end of an existing over and before the start of a new over. Such filtering of the collected sports data at an initial stage results in efficient processing of the collected sports data and focuses subsequent processing only on the relevant data.
[0094] In a preferred embodiment, a peak recognition method is deployed after analysing batting and bowling data. Peaks are identified on live data, and when a defined threshold limit is reached for a specific mode, the data is captured and a pattern recognition algorithm is employed to identify it as a shot.
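By way of a non-limiting illustration, such peak-based shot candidate detection may be sketched in Python using SciPy's find_peaks; the threshold, minimum peak spacing, and sampling rate are assumed example values, not the disclosed limits.

import numpy as np
from scipy.signal import find_peaks

def detect_shot_candidates(accel_magnitude, fs=100.0,
                           threshold=3.0, min_gap_s=0.5):
    """Flag samples where the acceleration magnitude exceeds a threshold,
    keeping peaks at least min_gap_s apart; each peak is a shot candidate
    to be passed on to the pattern recognition stage."""
    peaks, props = find_peaks(accel_magnitude,
                              height=threshold,
                              distance=int(min_gap_s * fs))
    return peaks / fs, props["peak_heights"]

# Example with synthetic data: two bursts over a quiet baseline.
sig = np.random.randn(1000) * 0.2
sig[300] = 5.0
sig[700] = 6.0
times, heights = detect_shot_candidates(sig)
print(times, heights)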
[0095] In some exemplary embodiments, the filtering of the collected sports data also includes cleaning, formatting, and normalizing the relevant sports data for inputting to the next stage, i.e. the identification module 310.
[0096] The one or more sensors 308 and the filtering module 314 may be communicably coupled with the action detection module 318 for communicating the filtered sports data and the collected sports data. The action detection module 318 may be configured to detect actions from the collected and/or filtered sports data. Based on the detection of the actions of the player 114, the action detection module 318 may be configured to discard a wrong action. In some exemplary embodiments, the actions not of interest comprise at least one of tapping of the cricket bat 102 on the ground before playing a shot and swinging of the bat 102 while running between wickets.

[0097] The action detection module 318 may be configured to detect an action from the collected or filtered sports data using the optical sensors. In some exemplary embodiments, the action detection comprises at least one of posture detection of the player 114, motion of at least a sports equipment, three-dimensional co-ordinates of a body part of the player 114 derived from the posture, eye focus of the player 114, and hand-eye coordination of the player 114. For instance, based on a feed from a camera, the posture of the player 114 may be detected, and the co-ordinates of the body of the player 114 may also be detected. Using the camera feed, the eye focus of the player 114 is also detected, and it is determined whether the focus of the eyes lies in the line of the ball's delivery.
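By way of illustration only, posture detection from a single camera frame may be sketched in Python using the MediaPipe Pose library (assumed to be available); this is one possible approach for obtaining body landmark co-ordinates and is not the disclosed implementation.

import cv2                    # OpenCV, assumed available
import mediapipe as mp        # MediaPipe, assumed available

def detect_posture(frame_bgr):
    """Return (x, y, z) body landmark coordinates of the player from a
    single camera frame, or None if no person is detected."""
    mp_pose = mp.solutions.pose
    with mp_pose.Pose(static_image_mode=True) as pose:
        results = pose.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not results.pose_landmarks:
        return None
    return [(lm.x, lm.y, lm.z) for lm in results.pose_landmarks.landmark]

# Example: read one frame from a camera feed and extract the posture.
cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()
if ok:
    landmarks = detect_posture(frame)
    print(None if landmarks is None else landmarks[:3])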
[0098] The filtering module 314 may be communicably coupled with the identification module 310 for communicating the filtered sports data to the identification module 310. The identification module 310 may be configured to identify the type of shot from the sports data. In an exemplary embodiment, the type of shot comprises at least a vertical shot, a horizontal shot, a defensive shot, or an attacking shot by the player 114. For identifying the type of shot, each shot type may be pre-defined (in terms of its direction, angle, etc.) and the collected sports data may then be analysed to identify the type of shot. For instance, the “vertical shot” may be a shot in which the bat 102 is perpendicular to the cricket ground when the bat 102 hits the ball. Similarly, the “horizontal shot” may be a shot in which the bat 102 is on the same axis as the cricket ground when the bat 102 hits the ball. Also, the “defensive shot” may be a shot in which the player 114 pulls the bat 102 closer to himself when the ball hits. Further, the defensive shot is a shot in which the intent of the player is to keep the ball on the ground without looking to score; the player tries to hit the ball with soft hands and with the bat face pointing towards the ground, and the defensive shot is played with a vertical bat. And, the “attacking shot” may be a shot in which the player 114 pushes the bat 102 away from himself when the ball hits.
[0099] The identification module 310 may be communicably coupled with the classification module 312 for communicating the identified type of shot. The classification module 312 may be configured to perform classification of the identified shot. Further, the classification comprises at least a drive, a cut, a pull, a hook, a sweep, a reverse sweep, or a paddle sweep. For performing the classification of the identified shot, a definition (such as angle, speed, strike rate, etc.) for each of the drive, cut, pull, hook, sweep, reverse sweep, and paddle sweep may be pre-fed into the electronic circuitry 104. Based on the type of the identified shot, each shot is then classified.
[00100] Herein, the classification further comprises using at least a deep learning model. In an exemplary embodiment, the deep learning model comprises at least one of a recurrent neural network using at least a gated recurrent unit, an attention-based long short-term memory, and a convolutional neural network.
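By way of a non-limiting illustration, a GRU-based shot classifier of the kind referred to above may be sketched with tf.keras as follows; the window length, channel count, number of classes, and hyperparameters are assumptions for the example only.

import tensorflow as tf

# Assumed setup: each shot is a fixed-length window of sensor samples
# (e.g. 128 time steps x 6 IMU channels) labelled with one of 7 shot classes.
TIME_STEPS, CHANNELS, NUM_CLASSES = 128, 6, 7

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(TIME_STEPS, CHANNELS)),
    tf.keras.layers.GRU(64),                       # recurrent layer over the time axis
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),  # drive, cut, pull, ...
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Training would use labelled shot windows, e.g.:
# model.fit(x_train, y_train, epochs=20, validation_split=0.2)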
[00101] The memory 320 may be configured to store or save data and/or information of the player 114 such as performance analysis.
[00102] The processor 316 may be communicably coupled with the interface 302, the receiver 304, the transmitter 306, the one or more sensors 308, the identification module 310, the classification module 312, the filtering module 314, the action detection module 318 and/or the memory 320. Further, the interface 302, the receiver 304, the transmitter 306, the one or more sensors 308, the identification module 310, the classification module 312, the filtering module 314, the processor 316, the action detection module 318 and the memory 320 may also be communicably coupled with each other.
[00103] Moreover, each of the identification module 310, the classification module 312, the filtering module 314, and/or the action detection module 318 may be implemented as a hardware-only module, a software-only module, or a combination of both hardware and software. Each of these modules 310, 312, 314, and 318 may also be communicably coupled with a processor and a memory for performing the operations and functions described herein.
[00104] The server 108 may also comprise the same modules or components (i.e. a receiver, a transmitter, an identification module, a classification module, a filtering module, a processor, an action detection module, and a memory) as the electronic circuitry 104 for performing the same functions/operations as described herein.
[00105] FIG. 4 illustrates a block diagram for building and training an artificial intelligence (AI) model for performance analysis of the player 114, in accordance with an example embodiment. As described above in FIG. 3, the one or more sensors 308 may collect sports data and provide the collected sports data to a feature extraction 402. This feature extraction 402 may perform the same functions and/or operations (such as filtering the collected sports data received from the sensors 308) as performed by the filtering module 314. The feature extraction 402 may filter the collected sports data to remove any noise and process the collected sports data as needed. The filtered sports data is then used for extracting features.
[00106] Signal processing and time-series analysis are used to filter the collected sports data. That is, time-series techniques such as signal processing, time-series analysis, and statistical modelling are used to extract relevant features from the collected sports data that describe the motion pattern and frequency components. Also, time-series analysis techniques such as trend analysis, seasonal decomposition, and autocorrelation are used to extract features from the collected sports data.
[00107] In addition, signal processing techniques, such as filtering, Fourier transforms, or wavelet transforms, are used to extract features from the collected sports data. These techniques are used to extract features such as frequency, amplitude, phase, and energy from the collected sports data.
[00108] Based on the filtered sports data, a performance analysis 406 is performed to determine the performance of the player 114. The performance of the player 114 may include, but is not limited to, a batting session level summary from the collected/filtered sports data from the sensors 308. Such a batting session level summary may include, but is not limited to, maximum hand speed, minimum hand speed, average hand speed, intensity (shots per hour), total time, timing of the shot, step counts, shot type, scoring potential per shot, out potential per shot, hit or miss, total distance (including running/walking), heart rate during the session, defensive vs attacking shots percentage, and/or session quality.

Benchmarked Data        | Speed | Speed at Impact | BackLift | Downswing | Efficiency
Cover Drive             | 42    | 42              | 170      | 137       | 100
Sweep                   | 70    | 63              | 153      | 194       | 90
Cut                     | 48    | 45              | 147      | 144       | 91
Backfoot Punch          | 74    | 68              | 174      | 176       | 90
Step Out over covers    | 56    | 79              | 186      | 74        | 97
Step out over straight  | 71    | 65              | 168      | 170       | 91
Table 1

Benchmarked Data          | Speed | Speed at Impact | BackLift | Downswing | Efficiency
Cover Drive               | 40    | 38              | 172      | 140       | 98
Comparison with Benchmark | 95    | 90              | 100      | 100       | 98
Shot Quality              | 97
Table 2

[00109] Table 1 discloses a sample of the data points collected from the sensors. The shots in the table include the cover drive, sweep, cut, backfoot punch, step out over covers, and step out over straight. The parameters tracked for each shot are speed in m/s, speed at impact in m/s, backlift angle in degrees, downswing angle in degrees, and efficiency in percentage. A person skilled in the art would understand that the tracked parameters are not limited to the mentioned parameters and that the disclosed units are not limited to the mentioned units.
[00110] Table 2 discloses a sample of benchmark data points, which might belong to an elite batting performer or to other standard benchmarks. Further, row one of Table 2 discloses the parameters tracked for each shot, which are speed in m/s, speed at impact in m/s, backlift angle in degrees, downswing angle in degrees, and efficiency in percentage. The mentioned parameters and units should not be construed to be limiting in any way. The "Comparison with Benchmark" row of Table 2 discloses the percentage comparison between the batter's cover drive data in Table 1 and the benchmarked cover drive data.
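By way of illustration only, the comparison row and shot-quality figure of Table 2 might be computed along the following lines; the ratio-based similarity rule and the equal weighting are assumptions for the example and are not the disclosed scoring method.

import numpy as np

# One plausible (assumed) scoring rule: per-parameter similarity as the
# ratio of the smaller to the larger value, and shot quality as the mean.
PARAMS = ["speed", "speed_at_impact", "backlift", "downswing", "efficiency"]

def compare_with_benchmark(shot: dict, benchmark: dict) -> dict:
    similarity = {}
    for p in PARAMS:
        lo, hi = sorted((shot[p], benchmark[p]))
        similarity[p] = round(100.0 * lo / hi) if hi else 100
    similarity["shot_quality"] = round(float(np.mean(list(similarity.values()))))
    return similarity

# Example with the cover-drive values from Table 1 and Table 2.
player_cover_drive = dict(zip(PARAMS, [42, 42, 170, 137, 100]))
benchmark_cover_drive = dict(zip(PARAMS, [40, 38, 172, 140, 98]))
print(compare_with_benchmark(player_cover_drive, benchmark_cover_drive))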
[00111] The performance of the player 114 may include, but is not limited to, batting data points per shot/swing from an audio sensor, an optical sensor, etc. Actions may be automatically detected, and thus it is automatically detected when a shot is played. Tapping of the cricket bat 102 on the ground before playing a shot and swinging of the bat 102 while running between wickets may be eliminated, thus minimizing false negatives and false positives.
[00112] The batting data points may also include, but are not limited to, hand speed (measured at the wrist in kph, mph, metres per second, or any other metric) and shot power/power factor in terms of bat speed, both angular and linear (degrees per second and metres per second). In addition, impact detection (hit/miss), i.e. the impact of ball and bat, may be included as a batting data point for combinations of different types, including bat type (standard English willow bat, any wooden bat, plastic bat, toy bat) and ball type (leather ball (red, white, pink), bowling machine ball, tennis ball, rubber ball). Further points include angles, such as backlift angle, downswing angle, follow-through angle, hand direction at the start of the backlift, hand direction at the start of the downswing, and impact angle of the hand. Additional points may be identification of shot type (vertical/horizontal bat shot and defensive/attacking shots), classification of shots (cover drive, straight drive, on drive, cut, pull, hook, sweep, reverse sweep/paddle sweep, glide/glance), vibration on the hand (impact location on the bat, such as the sweet spot), wrist break, handle grip/equipment gripping strength at stance and at the time of the shot, and shot quality.
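By way of a non-limiting illustration, angular bat speed and a rough linear hand speed may be derived from IMU samples as sketched below; the sampling rate, array shapes, and the naive integration (which ignores drift correction) are assumptions for the example only.

import numpy as np

def bat_speed_metrics(gyro_dps: np.ndarray, accel_ms2: np.ndarray, fs=100.0):
    """gyro_dps: (N, 3) angular rates in degrees/second,
    accel_ms2: (N, 3) linear acceleration in m/s^2 with gravity removed.
    Returns peak angular speed (deg/s) and a rough peak linear hand speed (m/s)."""
    angular_speed = np.linalg.norm(gyro_dps, axis=1)      # deg/s per sample
    # Naive velocity estimate: integrate acceleration over the swing window.
    # In practice drift must be corrected (e.g. zero-velocity updates).
    velocity = np.cumsum(accel_ms2, axis=0) / fs
    linear_speed = np.linalg.norm(velocity, axis=1)
    return {
        "peak_angular_speed_dps": float(angular_speed.max()),
        "peak_linear_speed_ms": float(linear_speed.max()),
    }

# Example with synthetic samples for a short swing window.
gyro = np.random.randn(50, 3) * 200
accel = np.random.randn(50, 3) * 5
print(bat_speed_metrics(gyro, accel))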
[00113] There are several different shots that a batsman can use to hit the ball and score runs. Each shot has its own specific technique and purpose, and skilled batsmen can choose the right shot for each delivery based on the speed, line, and length of the ball and field placement. Field placement in cricket refers to the strategic arrangement of fielders on the field by the team captain and the bowler to defend against the batting team's scoring opportunities. Field placement involves positioning the fielders in different areas of the field to maximize the chances of taking wickets or preventing runs.
[00114] Each fielding position on the cricket field has a specific name, such as slips, gully, point, cover, mid-off, mid-on, square leg, fine leg, and third man. The captain and bowler determine the positioning of the fielders based on the pitch conditions, the bowler's strengths and weaknesses, the opposition's batting style, and the score.
[00115] Here are some of the most common shots used in cricket:
[00116] Defensive shot: The defensive shot is used to block the ball and prevent the batsman from getting out. It involves holding the bat close to the body and using the bat's face to cushion the impact of the ball.
[00117] Forward defensive shot: The forward defensive shot is similar to the defensive shot, but is played with the batsman's front foot moved forward towards the bowler. This helps to provide more power and stability to the shot.
[00118] Backfoot defensive shot: The backfoot defensive shot is played with the batsman's weight on the back foot. This shot is used to defend against short-pitched deliveries that bounce up high.
[00119] Drive: The drive is a classic attacking shot, in which the batsman uses a full swing of the bat to hit the ball along the ground towards the boundary. There are several types of drives, including the straight drive, cover drive, and off-drive.
[00120] Pull shot: The pull shot is used to hit a short delivery that bounces up high. The batsman swings the bat horizontally across the body, hitting the ball towards the leg side.
[00121] Cut shot: The cut shot is used to hit a delivery that is wide outside the off-stump. The batsman cuts the ball with a horizontal bat, hitting it towards the off-side boundary.
[00122] Sweep shot: The sweep shot is played with a horizontal bat and is used to hit a spinning delivery towards the leg-side boundary. It requires the batsman to bend down low and use their wrists to generate power.
[00123] Reverse sweep: The reverse sweep is a variation of the sweep shot, in which the batsman swaps their hand position on the bat and hits the ball towards the off-side boundary.
[00124] Lofted shot: A lofted shot involves hitting the ball high into the air, with the aim of clearing the field and scoring a boundary or a six. Lofted shots include the slog sweep, the lofted drive, and the scoop shot.
[00125] These are just a few of the many different shots used in cricket. Skilled batsmen are able to combine these shots with footwork, timing, and shot selection to score runs and dominate the bowlers.
[00126] The performance of the player 114 may include, but is not limited to, batting data points from vision. The batting data points from vision may include, but are not limited to, action detection, shot type (drives, horizontal bat shots, direction of the hit), body posture (head position at stance, at trigger, and at impact, front foot/back foot shots, eye focus, i.e. how well the player is eyeing the ball while playing), ball type (full length, short pitch, etc.), speed of the ball, impact location (the ball's hit spot on the bat: bottom, sweet spot, top, edge), impact direction, ball exit velocity, ball exit angle, grounded/in air, predictive distance covered by the ball, predictive score, predictive out (catch, hit wicket, etc.) chance, and predictive interception by a fielder depending on the current field setting.
[00127] In an embodiment, the invention performs a prediction of the performance of the player 114. The prediction may include, but is not limited to, the runs scored or an out on the batting shot. Further, the actual performance of the player 114 on the field is tracked. The predicted performance and the actual performance of the player 114 are fed back to the machine learning model to improve accuracy.
[00128] The performance of the player 114 may also include a batting session level summary from the collected/filtered sports data from vision. Such a batting session level summary may include, but is not limited to, a wagon wheel, shot type graphs (percentage of vertical/horizontal shots, percentage of offside/onside shots), front foot/backfoot shot percentage, overall body balance, overall focus while eyeing the ball, and total score potential.
[00129] The performance of the player 114 may also include a weekly/monthly/quarterly summary. This includes, but is not limited to, total sessions in the week/month, total shots in the week/month, total defensive/attacking shots in the week/month, total training time in the week and month, improvement over time or sliding performance over time, a shot timing graph, and a quality of shot graph.
[00130] The performance of the player 114 may also include feedback. The feedback may include, but is not limited to, an auto training plan and preparation for a match format (T20/one day/test match). Further, a batting position plan (opener/first down/middle order/tail), batting against the bowling type, auto feedback, and shot selection vs ball type are part of the feedback. In addition, feedback may comprise rating the shots, out chances when the ball is played in the air, shot selection/recommendation depending on the field, comparison with elites, and technique detection (identification) and correction. Some feedback pertains to player tendency, which comprises front foot player, backfoot player, wrist player, top hand player, and bottom hand player. Stance detection and correction are part of the feedback; some aspects include open stance, side-on stance, and stance for spin and fast bowlers. Feedback may be employed for trigger detection and correction; some examples include early trigger, late trigger, no trigger, and trigger recommendation as per bowling type. Feedback also enables recommendation of the most suitable bat, for example the most suitable grip recommendation, and detection of over training or under training.
[00131] The performance of the player 114 may also include bowling data points. The bowling data points may include, but are not limited to, arm speed, arm release angle, wrist rotation/angle at release, run up steps, run up speed (acceleration, steps rhythm/consistency), run up distance, stride length, and landing foot and balance.
[00132] The performance of the player 114 may also include bowling parameters from vision. The bowling parameters from vision may include, but are not limited to, action detection, run up stride, run up rhythm/consistency, action type (front-on, midway, side-on), bowling arm detection (right arm, left arm), and arm angles (angle of the arm from the shoulder at release, angle between the upper and lower arm, arm angle at release). Further parameters include ball type (spin: off spin, leg spin, doosra, googly, flipper; spin type: finger spin, wrist spin; fast: medium, fast, outswing, inswing, leg cutter), ball classification (full toss, short pitch, good length, yorker; outside off, in line, outside leg), and ball parameters (ball release parameters such as on seam, wobble seam, cross seam; ball speed at release and after pitching; swing/spin angle after pitching; landing foot and balance).
[00133] The performance of the player 114 may also include a session summary. The session summary may include, but is not limited to, total balls bowled, maximum arm speed, minimum arm speed, average arm speed, intensity (balls per hour), total time, heart rate during the session, and heart rate zones (warmup, fat burn, cardio, threshold, peak).
[00134] The performance of the player 114 may also include feedback. The feedback may include, but is not limited to, training loads, alert zones, progress based on goals, ranking against a peer group/leaderboard, and an auto training plan. The auto training plan may include, but is not limited to, preparing for a T20/one day/test match, a bowling position plan (opener/first change/middle overs/end of innings), bowling against the batting style, auto feedback/alerts, ball length recommendation depending on the field, and comparison with elites.
[00135] The present invention also encompasses wicket keeping (cricket) parameters. The wicket keeping (cricket) parameters may include, but are not limited to, reaction time, hand speed, total catches, throws, total steps, and a heat map using GPS and/or other sensor data.
[00136] The present invention also encompasses fielding (cricket) parameters. The fielding (cricket) parameters may include, but are not limited to, running up speed, total distance covered, acceleration, heat-map, total catches, throws, and reaction time for catches and throws.
[00137] The present invention also encompasses health parameters, which may include, but are not limited to, a heart rate chart, heart rate zones (warmup, fat burn, cardio, threshold, peak), insights and risk heart zone, calories, and steps.
[00138] The present invention also encompasses calories parameters of the player 114. This accounts for all those parameters needed to measure calories burnt more accurately with the wearable device 118.
[00139] The present invention also encompasses audio sensor/s provided on the wearable device 118 or the electronic circuitry 104 to enable different sports and fitness modes (e.g. "I am batting now", "stop batting", "I am running now"). This enables touchless event triggers and allows the player 114 to focus on the activity that he/she is doing. Such an audio feedback method may involve using a speaker or other sound device to convey shot-level and session-level stats and to offer feedback and guidance to the player 114.
[00140] The present invention also encompasses an AI model for bat/ball impact. Any bat/ball impact may be identified using only optical sensor data. When this is used together with data from other sensors, the impact detection accuracy improves automatically. The present invention also encompasses auto detection of whether the player 114 is a right/left hand batter and bowler. Using data from one or more sensors and the identified range of activity over the sample data, the player's playing style is automatically identified and performance data is provided without the need to manually enter these details.
[00141] Based on the filtered sports data, an analytics 404 is performed for the player 114. The analytics 404 develops specialized fine-tuned feature extraction for sporting activity using proprietary sports activity algorithms.
[00142] The analytics 404 for the player 114 is provided as an input to an artificial intelligence (AI) model building and training 408. The AI model building and training 408 uses deep learning models. Further, deep learning models such as recurrent neural networks (RNNs), like the Gated Recurrent Unit (GRU) and Attention-based LSTM (ATT-LSTM), and convolutional neural networks (CNNs), like Inception-v3 and MobileNet, are used to analyse the filtered sports data from the filtering module 314 and the features from the feature extraction 402. Further, RNNs are used for modelling long-term dependencies in the data, while CNNs are well suited for detecting patterns in spatiotemporal data. Deep learning methods are also used to pull out spatial and temporal features from the collected sports data and to model sequential data. By identifying and analysing movement patterns of the player 114, activities of the player 114 are accurately identified and classified.
[00143] A recurrent neural network (RNN) is a type of neural network that is designed to handle sequential data. It processes inputs in a sequence, one at a time, while maintaining a "hidden state" that encodes information from previous inputs. The hidden state is updated at each time step based on the current input and the previous hidden state.
[00144] A gated recurrent unit (GRU) is a type of RNN that uses gating mechanisms to control the flow of information in the network. It is similar to the more well-known Long Short-Term Memory (LSTM) network in that it is designed to address the vanishing gradient problem that can occur in standard RNNs. The GRU has fewer parameters than the LSTM, making it faster to train and less prone to overfitting.
[00145] The GRU has two gating mechanisms: an update gate and a reset gate. The update gate determines how much of the previous hidden state should be retained and how much of the new input should be incorporated into the updated hidden state. The reset gate determines how much of the previous hidden state should be forgotten and how much of the new input should be used to create a new candidate hidden state.
[00146] At each time step, the GRU takes in an input vector x(t) and the previous hidden state h(t-1). It then computes the update gate z(t) and the reset gate r(t) based on the input and the previous hidden state. It then uses these gates to update the candidate hidden state h~(t) and the new hidden state h(t) as follows:
z(t) = sigmoid(Wz * [h(t-1), x(t)])
r(t) = sigmoid(Wr * [h(t-1), x(t)])
h~(t) = tanh(W * [r(t) * h(t-1), x(t)])
h(t) = (1 - z(t)) * h(t-1) + z(t) * h~(t)
[00147] Where sigmoid and tanh are activation functions, Wz, Wr, and W are weight matrices that are learned during training, and [h(t-1), x(t)] denotes the concatenation of the previous hidden state and the current input.
[00148] In summary, the GRU uses gating mechanisms to control the flow of information in the network and to selectively update the hidden state at each time step based on the input and the previous hidden state. This allows the network to capture long-term dependencies in sequential data and to avoid the vanishing gradient problem that can occur in standard RNNs.
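By way of a non-limiting illustration, a minimal sketch of the GRU update described above is given below in Python/NumPy. The dimensions, weight values, and sensor-window sizes are hypothetical and are chosen only to show the gating arithmetic.

import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x_t, h_prev, Wz, Wr, W):
    # One GRU time step following the equations in paragraph [00146].
    concat = np.concatenate([h_prev, x_t])            # [h(t-1), x(t)]
    z_t = sigmoid(Wz @ concat)                        # update gate z(t)
    r_t = sigmoid(Wr @ concat)                        # reset gate r(t)
    h_cand = np.tanh(W @ np.concatenate([r_t * h_prev, x_t]))   # candidate h~(t)
    return (1.0 - z_t) * h_prev + z_t * h_cand        # new hidden state h(t)

# Hypothetical dimensions: a 6-axis IMU sample as input, an 8-dimensional hidden state.
rng = np.random.default_rng(0)
input_dim, hidden_dim = 6, 8
Wz = 0.1 * rng.standard_normal((hidden_dim, hidden_dim + input_dim))
Wr = 0.1 * rng.standard_normal((hidden_dim, hidden_dim + input_dim))
W = 0.1 * rng.standard_normal((hidden_dim, hidden_dim + input_dim))

h = np.zeros(hidden_dim)
for x in rng.standard_normal((50, input_dim)):        # 50 consecutive sensor samples
    h = gru_step(x, h, Wz, Wr, W)
print(h)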
[00149] An attention-based LSTM is a type of recurrent neural network (RNN) that uses an attention mechanism to selectively focus on specific parts of the input sequence. It is similar to a regular LSTM, but it adds an additional component that allows the network to selectively attend to different parts of the input sequence, based on their relevance to the task at hand.
[00150] The basic idea behind attention is to give the network the ability to focus on the most relevant parts of the input sequence for a particular task. This is done by introducing a set of attention weights, which are used to compute a weighted sum of the input sequence, where the weights indicate the relative importance of each input element. The attention weights are learned during training and can be thought of as a form of soft attention, in which the network can attend to multiple parts of the input sequence at once, and the attention weights can vary from one time step to the next.
[00151] In an attention-based LSTM, the attention mechanism is added to the LSTM architecture in the following way:
[00152] The input sequence is fed into the LSTM layer, which produces a sequence of hidden states. The last hidden state of the LSTM layer is used as a context vector, which summarizes the information in the input sequence. The attention mechanism computes a set of attention weights, which are used to compute a weighted sum of the input sequence.
[00153] The context vector and the weighted sum of the input sequence are concatenated, and fed into a final output layer, which produces the output for the task.
[00154] The attention mechanism can be implemented in various ways, but one common approach is to use a feedforward neural network to compute the attention weights. The input to the feedforward network is a concatenation of the current hidden state of the LSTM and the input at the current time step, and the output is a scalar value that represents the relevance of the input at that time step. The attention weights are then computed using a SoftMax function, which ensures that the weights sum to one.
[00155] In summary, an attention-based LSTM is a type of RNN that uses an attention mechanism to selectively focus on different parts of the input sequence, based on their relevance to the task at hand. This allows the network to capture long-term dependencies in the input sequence, and to make more accurate predictions by attending to the most relevant information.
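A minimal sketch of this soft attention computation is given below in Python/NumPy, assuming hypothetical LSTM hidden states and sensor inputs; the single-layer feedforward scorer and all dimensions are illustrative only.

import numpy as np

def softmax(a):
    e = np.exp(a - np.max(a))
    return e / e.sum()

rng = np.random.default_rng(1)
inputs = rng.standard_normal((20, 6))            # hypothetical input sequence (20 steps, 6-axis IMU)
hidden_states = rng.standard_normal((20, 8))     # stand-in for the LSTM layer's hidden states
context = hidden_states[-1]                      # last hidden state used as the context vector

# Feedforward scorer: relevance of each time step from [h(t), x(t)], as described above.
Wa = 0.1 * rng.standard_normal((1, 8 + 6))
scores = np.array([(Wa @ np.concatenate([h, x]))[0] for h, x in zip(hidden_states, inputs)])
weights = softmax(scores)                        # attention weights that sum to one

weighted_inputs = (weights[:, None] * inputs).sum(axis=0)   # weighted sum of the input sequence
combined = np.concatenate([context, weighted_inputs])       # passed to the final output layer
print(weights.round(3), combined.shape)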
[00156] A convolutional neural network (CNN) is a type of artificial neural network that is commonly used for analyzing visual imagery. CNNs are designed to recognize patterns and features in images by using layers of filters, known as convolutional layers, to extract increasingly complex features from the input image.
[00157] The basic architecture of a CNN consists of a series of convolutional layers, followed by one or more fully connected layers, and an output layer. The convolutional layers apply a set of learnable filters to the input image, producing a set of feature maps that highlight the presence of certain patterns or features in the image. The fully connected layers combine these features to make a prediction about the class or category of the image.
[00158] There are several types of CNNs that are commonly used in computer vision applications:
[00159] LeNet: LeNet was one of the earliest CNNs and was introduced in the 1990s by Yann LeCun. It was designed to recognize handwritten digits and was used for automated check reading.
[00160] ResNet: ResNet is known for its ability to train very deep CNNs (up to hundreds of layers) without suffering from the vanishing gradient problem. It uses residual connections to allow the network to learn residual functions, which can be thought of as the difference between the input and output of a block of layers.
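For illustration, a minimal PyTorch sketch of such a residual connection is shown below; the channel count and layer choices are hypothetical and serve only to show the skip connection.

import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    # y = x + F(x): the block learns the residual function F.
    def __init__(self, channels=16):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        f = self.relu(self.bn1(self.conv1(x)))
        f = self.bn2(self.conv2(f))
        return self.relu(x + f)                  # the skip connection adds the input back

out = ResidualBlock()(torch.randn(1, 16, 32, 32))
print(out.shape)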
[00161] Inception: The Inception network is known for its use of "inception modules", which consist of multiple parallel convolutional layers with different filter sizes. This allows the network to capture features at different scales and resolutions.
[00162] MobileNet is a convolutional neural network (CNN) that was specifically designed for mobile and embedded devices with limited computational resources. It was introduced as a way to make deep learning models more efficient and practical for deployment on mobile devices.
[00163] The main innovation of MobileNet is the use of depthwise separable convolutions, which are a form of convolutional operation that factorizes the standard convolution into a depthwise convolution and a pointwise convolution. This reduces the number of parameters and the computational cost of the model, while still allowing it to achieve good accuracy on image classification tasks.
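The efficiency gain of depthwise separable convolutions can be illustrated with a simple parameter count, assuming a hypothetical 3x3 convolution mapping 32 input channels to 64 output channels (bias terms ignored):

k, m, n = 3, 32, 64                      # kernel size, input channels, output channels

standard = k * k * m * n                 # standard convolution: 18432 parameters
depthwise = k * k * m                    # one 3x3 filter per input channel: 288 parameters
pointwise = m * n                        # 1x1 convolution mixing channels: 2048 parameters
separable = depthwise + pointwise        # 2336 parameters, roughly 8x fewer
print(standard, separable, round(standard / separable, 1))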
[00164] These are just a few examples of the many different types of CNNs that have been developed over the years. Each type of CNN has its own strengths and weaknesses, and the choice of which one to use depends on the specific application and the available resources.
[00165] For training the AI model, the deep learning models are trained using the extracted features or filtered sports data as an input and the desired motion pattern as an output. As an example, the sports data is monitored continuously from the one or more sensors 308. When a particular pre-defined threshold limit is reached with regard to data from the one or more sensors 308, the collected sports data and the filtered sports data are sampled for pattern recognition for building an AI model by the AI model building and training 408.
[00166] For instance, the batting threshold for the accelerometer may be set at 8 g and the batting threshold for the gyroscope may be set at 800 dps. Similarly, the bowling threshold for the accelerometer may be set at 10 g and the bowling threshold for the gyroscope may be set at 1400 dps. Once such sports data is received at the desired threshold, the sports data is processed for pattern recognition.
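A minimal sketch of such a threshold trigger, using the example limits quoted above, is given below; the window length, data layout, and use of vector magnitudes are illustrative assumptions only.

import numpy as np

BATTING_ACC_G, BATTING_GYRO_DPS = 8.0, 800.0      # example thresholds from paragraph [00166]
BOWLING_ACC_G, BOWLING_GYRO_DPS = 10.0, 1400.0

def trigger_index(acc_g, gyro_dps, mode="batting"):
    # Return the first sample index at which both mode-specific thresholds are crossed, else None.
    acc_limit, gyro_limit = ((BATTING_ACC_G, BATTING_GYRO_DPS) if mode == "batting"
                             else (BOWLING_ACC_G, BOWLING_GYRO_DPS))
    acc_mag = np.linalg.norm(acc_g, axis=1)        # magnitude over the three accelerometer axes
    gyro_mag = np.linalg.norm(gyro_dps, axis=1)    # magnitude over the three gyroscope axes
    hits = np.where((acc_mag >= acc_limit) & (gyro_mag >= gyro_limit))[0]
    return int(hits[0]) if hits.size else None

# Hypothetical 100-sample window of 3-axis accelerometer (g) and gyroscope (dps) data.
rng = np.random.default_rng(2)
acc = rng.normal(0, 2, (100, 3)); acc[50] = [9.0, 3.0, 1.0]
gyro = rng.normal(0, 200, (100, 3)); gyro[50] = [900.0, 100.0, 50.0]
print(trigger_index(acc, gyro, "batting"))         # the threshold index around which the pattern is checked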
[00167] In the case of batting by the player 114, the analytics 404 checks the sports data above the threshold limit around the threshold index (e.g., index 50). The pattern identifies the acceleration on each axis and the deceleration after a peak value in the accelerometer sports data. Similarly, an increase and a decrease in dps in the gyroscope data is also checked. Both data streams are matched to confirm, using timestamps, that the peaks on both sensors 308 were detected at the same time. The start of acceleration and the dip in acceleration define the start and the end of the shot. Once the pattern is recognized as a shot, the data from the start-of-shot index to the end-of-shot index is passed to a speed calculation algorithm. Linear velocity is calculated using the accelerometer data and angular velocity is calculated using the gyroscope data.
[00168] In case only one sensor is used, the pattern is matched only for that sensor, for example acceleration and deceleration in the case of accelerometer data. Linear velocity is calculated using the accelerometer data and converted to predict the angular velocity. In case only a gyroscope is used, the analytics 404 checks the increase in rotation speed in the gyroscope data around the peak to find the pattern indicating that a shot is played, and calculates the angular velocity. The linear velocity is then predicted.
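The speed calculation algorithm itself is proprietary; a simplified, non-limiting sketch of segmenting the shot and estimating velocities from a detected window is shown below, assuming a 100 Hz sampling rate and a heuristic baseline for the start and end of the shot.

import numpy as np

def segment_shot(acc_mag_g, threshold_g=8.0):
    # Walk outwards from the first sample above the threshold until acceleration dips
    # back towards the pre-shot level, approximating the start and end of the shot.
    peak = int(np.argmax(acc_mag_g >= threshold_g))
    baseline = 0.2 * acc_mag_g[peak]               # heuristic dip level (hypothetical)
    start = peak
    while start > 0 and acc_mag_g[start - 1] > baseline:
        start -= 1
    end = peak
    while end < len(acc_mag_g) - 1 and acc_mag_g[end + 1] > baseline:
        end += 1
    return start, end

def shot_speeds(acc_ms2, gyro_dps, start, end, dt=0.01):
    # Linear velocity: integrate acceleration over the shot window (magnitude of the result).
    # Angular velocity: peak rotation rate from the gyroscope within the same window.
    linear_velocity = np.linalg.norm((acc_ms2[start:end + 1] * dt).sum(axis=0))
    angular_velocity = float(np.max(np.linalg.norm(gyro_dps[start:end + 1], axis=1)))
    return linear_velocity, angular_velocity

# Hypothetical 100 Hz window with a shot burst around sample 50.
rng = np.random.default_rng(3)
acc = rng.normal(0, 1, (100, 3)); acc[45:56] += np.array([90.0, 30.0, 10.0])        # m/s^2
gyro = rng.normal(0, 100, (100, 3)); gyro[45:56] += np.array([900.0, 200.0, 50.0])  # dps
start, end = segment_shot(np.linalg.norm(acc, axis=1) / 9.81)
print(start, end, shot_speeds(acc, gyro, start, end))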
[00169] Furthermore, when an action is detected, the collected or filtered sports data is processed to check if the ball has hit the bat (impact of the ball on the bat). As the hand/arm motion during a shot or while bowling has consistent acceleration and deceleration, a sudden dip and spike occurs when the bat hits the ball. Similarly, in the case of gyroscope data, a sudden dip may be observed in the rotational data. The dip and spike threshold limit may be checked against the recognized dip values when the ball hits the bat. This dip has to lie between the start-of-shot and end-of-shot data.
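A simplified sketch of this impact check is given below; the 4 g dip/spike threshold and the sample data are hypothetical and only illustrate matching a sudden jump inside the shot window.

import numpy as np

def detect_impact(acc_mag_g, start, end, dip_spike_threshold_g=4.0):
    # The hand/arm motion accelerates and decelerates smoothly, so a sudden dip or spike
    # between consecutive samples inside the shot window is treated as the ball hitting the bat.
    window = acc_mag_g[start:end + 1]
    jumps = np.abs(np.diff(window))
    idx = int(np.argmax(jumps))
    return (start + idx, float(jumps[idx])) if jumps[idx] >= dip_spike_threshold_g else None

acc_mag_g = np.array([1.0, 6.0, 9.0, 9.5, 3.0, 9.0, 8.0, 2.0, 1.0])   # dip after the fourth sample
print(detect_impact(acc_mag_g, 0, len(acc_mag_g) - 1))                 # impact near index 3, 6.5 g jump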
[00170] The performance determined by the performance analysis 406 as well as an output of the AI model building and training 408 are evaluated by an evaluation 410. Such evaluation involves assessing the trained AI model's performance using a set of test data and making adjustments as necessary. Moreover, the outcomes of the AI model are evaluated and checked to determine if calibration is needed for pattern recognition.
[00171] Based on the evaluation 410, the AI model is deployed by the model deployment 412. For this, the trained and evaluated model is used in a real-time system to identify motion patterns in new data from the one or more sensors 308. That is, based on the new data from the sensors 308, the AI model is used for performance analysis of the player 114.
[00172] FIG. 5 illustrates a block diagram of deep learning algorithms utilized for building and training the AI model, in accordance with an example embodiment. As shown in FIG. 5, deep learning models such as recurrent neural networks (RNNs) 502 are used. One example of an RNN is the Gated Recurrent Unit (GRU). An Attention-based LSTM (ATT-LSTM) 504 is also utilized as a deep learning algorithm for training the AI model. Convolutional neural networks (CNNs) 506 are also utilized as deep learning algorithms for training the AI model. Examples of CNNs include Inception-v3 and MobileNet, which are used to analyse the filtered sports data 314 and the feature extraction 402. Further, RNNs are used for modelling long-term dependencies in the data, while CNNs are effective for detecting patterns in spatiotemporal data. Deep learning methods are also used to extract spatial and temporal features from the collected sports data and to model sequential data. By identifying and analysing movement patterns of the player 114, activities of the player 114 are accurately identified and classified.
[00173] FIG. 6 illustrates a flow diagram of a method 600 for analysing sports data, in accordance with an example embodiment. The method flow diagram 600 starts at step 602.
[00174] At step 604, sports data of a player 114 is collected by using one or more sensors 308. As explained in FIG. 1 above, the sports data may include any activity or event related to the player 114, such as running, batting, vocal/audio communication, breathing, heart rate, drinking, and/or any such activity of the player 114. The one or more sensors 308 may comprise at least one of accelerometers, gyroscopes, magnetometers, piezoelectric sensors, electromagnetic trackers, optical sensors, an imaging sensor, a heart-rate sensor, a microphone, LiDAR (Light Detection and Ranging) sensors, and any such sensor that is obvious to a person skilled in the art. For example, the accelerometer may sense the speed at which the player 114 is running or batting.
[00175] At step 606, the collected sports data is filtered. Signal processing and time-series analysis are used to filter the collected sports data. That is, techniques such as signal processing, time-series analysis, and statistical modelling are used to extract relevant features from the collected sports data that describe the motion pattern and frequency components. Time-series analysis techniques such as trend analysis, seasonal decomposition, and autocorrelation are also used to extract features from the collected sports data.
[00176] In addition, signal processing techniques are used to extract features from the collected sports data, such as filtering, Fourier transforms, or wavelet transforms. These techniques are used to extract features such as frequency, amplitude, phase, and energy from the collected sports data.
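A minimal sketch of such filtering and frequency-feature extraction is given below, assuming a hypothetical 100 Hz sampling rate and a fourth-order Butterworth low-pass filter; the cut-off frequency and the synthetic signal are illustrative only.

import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0                                        # assumed sampling rate in Hz
t = np.arange(0, 2, 1 / fs)
raw = np.sin(2 * np.pi * 3 * t) + 0.3 * np.random.default_rng(4).standard_normal(t.size)

# Filtering: remove high-frequency noise from the raw sensor signal.
b, a = butter(N=4, Wn=10, btype="low", fs=fs)
filtered = filtfilt(b, a, raw)

# Fourier transform: frequency, amplitude, and energy features.
spectrum = np.abs(np.fft.rfft(filtered))
freqs = np.fft.rfftfreq(filtered.size, d=1 / fs)
dominant_freq = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC component
energy = float(np.sum(filtered ** 2))
print(dominant_freq, energy)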
[00177] At step 608, the type of shot is identified from the collected sports data. In some exemplary embodiments, the type of shot comprises at least a vertical shot, a horizontal shot, a defensive shot or an attacking shot. For identifying the type of shot, the identification module 310 is used. For this, such shots may be pre-defined (in terms of their direction, angle, etc.) by a technician and then the collected sports data may be analysed to identify the type of shot. For instance, the "vertical shot" may be a shot in which the bat 102 is perpendicular to the cricket ground when the bat 102 hits the ball. Similarly, the "horizontal shot" may be a shot in which the bat 102 is on the same axis as the cricket ground when the bat 102 hits the ball. Also, the "defensive shot" may be a shot in which the player 114 pulls the bat 102 closer to himself when the ball hits, and the "attacking shot" may be a shot in which the player 114 pushes the bat 102 away from himself when the ball hits.
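A simplified sketch of how such technician-defined definitions might be applied is shown below; the 30-degree tolerance and the motion labels are hypothetical values used only to illustrate mapping an impact-time bat orientation and motion direction to the four shot types described above.

def identify_shot_type(bat_angle_deg, bat_motion):
    # bat_angle_deg: angle between the bat 102 and the ground at impact (90 = perpendicular).
    # bat_motion: "towards_body" or "away_from_body" at impact.
    orientation = "vertical" if abs(bat_angle_deg - 90) <= 30 else "horizontal"
    intent = "defensive" if bat_motion == "towards_body" else "attacking"
    return orientation, intent

print(identify_shot_type(85, "away_from_body"))      # ('vertical', 'attacking')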
[00178] At step 610, classification of the shot is performed. In some exemplary embodiments, the classification comprises at least a drive, a cut, a pull, a hook, a sweep, a reverse sweep, or a paddle sweep. For performing the classification of the identified shot, the classification module 312 is used. A definition (such as angle, speed, strike rate, etc.) for each of the drive, cut, pull, hook, sweep, reverse sweep, and paddle sweep may be pre-fed into the electronic circuitry 104 for classifying these shots. Based on the type of the identified shot, each shot is classified.
[00179] Herein, the classification further comprises using at least a deep learning model. In an exemplary embodiment, the deep learning model comprises at least one of a recurrent neural network using at least a gated recurrent unit, an attention-based long short-term memory, and a convolutional neural network.
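As a non-limiting illustration of such a deep learning classifier, a minimal PyTorch sketch using a GRU layer over a window of sensor samples is given below; the architecture, window size, and hyperparameters are hypothetical, and the model would need to be trained as described in paragraph [00165] before producing meaningful classifications.

import torch
import torch.nn as nn

SHOT_CLASSES = ["drive", "cut", "pull", "hook", "sweep", "reverse sweep", "paddle sweep"]

class ShotClassifier(nn.Module):
    # A GRU over a window of sensor samples, followed by a linear layer over the shot classes.
    def __init__(self, input_dim=6, hidden_dim=32, num_classes=len(SHOT_CLASSES)):
        super().__init__()
        self.gru = nn.GRU(input_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):                          # x: (batch, time, input_dim)
        _, h_last = self.gru(x)                    # h_last: (1, batch, hidden_dim)
        return self.head(h_last.squeeze(0))        # logits over the shot classes

model = ShotClassifier()
window = torch.randn(1, 100, 6)                    # one hypothetical 100-sample, 6-axis window
print(SHOT_CLASSES[model(window).argmax(dim=1).item()])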
[00180] The method flow diagram 600 ends at step 612.
[00181] Accordingly, blocks of the flow diagram support combinations of means for performing the specified functions and combinations of operations for performing the specified functions. It will also be understood that one or more blocks of the flow diagram, and combinations of blocks in the flow diagram, may be implemented by special purpose hardware-based computer systems which perform the specified functions, or combinations of special purpose hardware and computer instructions.
[00182] Many modifications and other embodiments of the inventions set forth herein will come to mind to one skilled in the art to which these inventions pertain having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is to be understood that the inventions are not to be limited to the specific embodiments disclosed and that modifications and other embodiments are intended to be included within the scope of the appended claims. Moreover, although the foregoing descriptions and the associated drawings describe example embodiments in the context of certain example combinations of elements and/or functions, it should be appreciated that different combinations of elements and/or functions may be provided by alternative embodiments without departing from the scope of the appended claims. In this regard, for example, different combinations of elements and/or functions than those explicitly described above are also contemplated as may be set forth in some of the appended claims. Although specific terms are employed herein, they are used in a generic and descriptive sense only and not for purposes of limitation.
Claims:
We Claim:
1. A system for analysing sports data, the system comprising:
one or more sensors to collect sports data;
a filtering module to filter the collected sports data;
an identification module to identify type of shot from the collected sport data wherein the type of shot comprises at least a vertical, horizontal, defensive or attacking; and
a classification module to perform classification of the shot wherein the classification comprises at least a drive, cut, pull, hook, sweep, reverse sweep, or paddle sweep.

2. The system of claim 1, wherein one or more sensors comprise at least one of accelerometers, gyroscopes, magnetometers, piezoelectric sensors, electromagnetic trackers, and optical sensors.

3. The system of claim 1, wherein the one or more sensors are attached to a wearable device wherein the wearable device comprises at least of a smart watch, smart band, and smart clothing.

4. The system of claim 1, wherein signal processing and time-series analysis are used to filter the collected sport data.

5. The system of claim 1, wherein the classification further comprises using at least a deep learning model.

6. The system of claim 5, wherein the deep learning model comprises at least of a recurrent neural network using at least a gated recurrent unit, attention-based long short-term memory, and convolutional neural network.

7. The system of claim 1, further comprising an action detection module to:
detect actions from the collected sports data; and
discard a wrong action.

8. The system of claim 7, wherein the wrong action comprises at least of a tapping and running with a bat.

9. The system of claim 2, further comprising detecting an action from the collected data using the optical sensors wherein the action detection comprises at least of a posture detection of a player, motion of at least a sports equipment, three dimensional co-ordinates of a body part of the player from the posture, eye focus of the player and hand-eye coordination of the player.

10. A method for analysing sports data, the method comprising:
collecting sports data by using one or more sensors;
filtering the collected sports data;
identifying type of shot from the collected sport data wherein the type of shot comprises at least a vertical, horizontal, defensive or attacking; and
performing classification of the shot wherein the classification comprises at least a drive, cut, pull, hook, sweep, reverse sweep, or paddle sweep.

11. The method of claim 10, wherein one or more sensors comprise at least one of accelerometers, gyroscopes, magnetometers, piezoelectric sensors, electromagnetic trackers, and optical sensors.

12. The method of claim 10, wherein the one or more sensors are attached to a wearable device wherein the wearable device comprises at least of a smart watch, smart band, and smart clothing.

13. The method of claim 10, wherein the filtering the collected sports data further comprises using signal processing and time-series analysis.

14. The method of claim 10, wherein the classification further comprises using at least a deep learning model.

15. The method of claim 14, wherein the deep learning model comprises at least of a recurrent neural network using at least a gated recurrent unit, attention-based long short-term memory, and convolutional neural network.

16. The method of claim 10, further comprising:
detecting actions from the collected sports data; and
discarding a wrong action.

17. The method of claim 16, wherein the wrong action comprises at least of a tapping, and running with a bat.

18. The method of claim 11, further comprising detecting an action from the collected data using the optical sensors wherein the action detection comprises at least of a posture detection of a player, motion of at least a sports equipment, three dimensional co-ordinates of a body part of the player from the posture, eye focus of the player and hand-eye coordination of the player.

Documents

Application Documents

# Name Date
1 202341015021-FORM FOR STARTUP [06-03-2023(online)].pdf 2023-03-06
2 202341015021-FORM FOR SMALL ENTITY(FORM-28) [06-03-2023(online)].pdf 2023-03-06
3 202341015021-FORM 1 [06-03-2023(online)].pdf 2023-03-06
4 202341015021-EVIDENCE FOR REGISTRATION UNDER SSI(FORM-28) [06-03-2023(online)].pdf 2023-03-06
5 202341015021-EVIDENCE FOR REGISTRATION UNDER SSI [06-03-2023(online)].pdf 2023-03-06
6 202341015021-DRAWINGS [06-03-2023(online)].pdf 2023-03-06
7 202341015021-COMPLETE SPECIFICATION [06-03-2023(online)].pdf 2023-03-06
8 202341015021-STARTUP [07-03-2023(online)].pdf 2023-03-07
9 202341015021-FORM28 [07-03-2023(online)].pdf 2023-03-07
10 202341015021-FORM-9 [07-03-2023(online)].pdf 2023-03-07
11 202341015021-FORM 3 [07-03-2023(online)].pdf 2023-03-07
12 202341015021-FORM 18A [07-03-2023(online)].pdf 2023-03-07
13 202341015021-ENDORSEMENT BY INVENTORS [07-03-2023(online)].pdf 2023-03-07
14 202341015021-Proof of Right [09-03-2023(online)].pdf 2023-03-09
15 202341015021-FORM-26 [09-03-2023(online)].pdf 2023-03-09
16 202341015021-FER.pdf 2023-05-03
17 202341015021-FER_SER_REPLY [06-06-2023(online)].pdf 2023-06-06
18 202341015021-CORRESPONDENCE [06-06-2023(online)].pdf 2023-06-06
19 202341015021-CLAIMS [06-06-2023(online)].pdf 2023-06-06
20 202341015021-US(14)-HearingNotice-(HearingDate-25-08-2023).pdf 2023-08-04
21 202341015021-FORM-26 [22-08-2023(online)].pdf 2023-08-22
22 202341015021-Correspondence to notify the Controller [22-08-2023(online)].pdf 2023-08-22
23 202341015021-Written submissions and relevant documents [08-09-2023(online)].pdf 2023-09-08
24 202341015021-Annexure [08-09-2023(online)].pdf 2023-09-08

Search Strategy

1 cricketE_02-05-2023.pdf